Dec  5 06:03:41 np0005546909 kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec  5 06:03:41 np0005546909 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  5 06:03:41 np0005546909 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:03:41 np0005546909 kernel: BIOS-provided physical RAM map:
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  5 06:03:41 np0005546909 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  5 06:03:41 np0005546909 kernel: NX (Execute Disable) protection: active
Dec  5 06:03:41 np0005546909 kernel: APIC: Static calls initialized
Dec  5 06:03:41 np0005546909 kernel: SMBIOS 2.8 present.
Dec  5 06:03:41 np0005546909 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  5 06:03:41 np0005546909 kernel: Hypervisor detected: KVM
Dec  5 06:03:41 np0005546909 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  5 06:03:41 np0005546909 kernel: kvm-clock: using sched offset of 3156555422 cycles
Dec  5 06:03:41 np0005546909 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  5 06:03:41 np0005546909 kernel: tsc: Detected 2800.000 MHz processor
Dec  5 06:03:41 np0005546909 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  5 06:03:41 np0005546909 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  5 06:03:41 np0005546909 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  5 06:03:41 np0005546909 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  5 06:03:41 np0005546909 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  5 06:03:41 np0005546909 kernel: Using GB pages for direct mapping
Dec  5 06:03:41 np0005546909 kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec  5 06:03:41 np0005546909 kernel: ACPI: Early table checksum verification disabled
Dec  5 06:03:41 np0005546909 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  5 06:03:41 np0005546909 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:03:41 np0005546909 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:03:41 np0005546909 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:03:41 np0005546909 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  5 06:03:41 np0005546909 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:03:41 np0005546909 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  5 06:03:41 np0005546909 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  5 06:03:41 np0005546909 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  5 06:03:41 np0005546909 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  5 06:03:41 np0005546909 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  5 06:03:41 np0005546909 kernel: No NUMA configuration found
Dec  5 06:03:41 np0005546909 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  5 06:03:41 np0005546909 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  5 06:03:41 np0005546909 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  5 06:03:41 np0005546909 kernel: Zone ranges:
Dec  5 06:03:41 np0005546909 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  5 06:03:41 np0005546909 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  5 06:03:41 np0005546909 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  5 06:03:41 np0005546909 kernel:  Device   empty
Dec  5 06:03:41 np0005546909 kernel: Movable zone start for each node
Dec  5 06:03:41 np0005546909 kernel: Early memory node ranges
Dec  5 06:03:41 np0005546909 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  5 06:03:41 np0005546909 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  5 06:03:41 np0005546909 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  5 06:03:41 np0005546909 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  5 06:03:41 np0005546909 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  5 06:03:41 np0005546909 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  5 06:03:41 np0005546909 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  5 06:03:41 np0005546909 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  5 06:03:41 np0005546909 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  5 06:03:41 np0005546909 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  5 06:03:41 np0005546909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  5 06:03:41 np0005546909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  5 06:03:41 np0005546909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  5 06:03:41 np0005546909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  5 06:03:41 np0005546909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  5 06:03:41 np0005546909 kernel: TSC deadline timer available
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Max. logical packages:   8
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Max. logical dies:       8
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Max. dies per package:   1
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Max. threads per core:   1
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Num. cores per package:     1
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Num. threads per package:   1
Dec  5 06:03:41 np0005546909 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  5 06:03:41 np0005546909 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  5 06:03:41 np0005546909 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  5 06:03:41 np0005546909 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  5 06:03:41 np0005546909 kernel: Booting paravirtualized kernel on KVM
Dec  5 06:03:41 np0005546909 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  5 06:03:41 np0005546909 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  5 06:03:41 np0005546909 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  5 06:03:41 np0005546909 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  5 06:03:41 np0005546909 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:03:41 np0005546909 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec  5 06:03:41 np0005546909 kernel: random: crng init done
Dec  5 06:03:41 np0005546909 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: Fallback order for Node 0: 0 
Dec  5 06:03:41 np0005546909 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  5 06:03:41 np0005546909 kernel: Policy zone: Normal
Dec  5 06:03:41 np0005546909 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  5 06:03:41 np0005546909 kernel: software IO TLB: area num 8.
Dec  5 06:03:41 np0005546909 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  5 06:03:41 np0005546909 kernel: ftrace: allocating 49335 entries in 193 pages
Dec  5 06:03:41 np0005546909 kernel: ftrace: allocated 193 pages with 3 groups
Dec  5 06:03:41 np0005546909 kernel: Dynamic Preempt: voluntary
Dec  5 06:03:41 np0005546909 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  5 06:03:41 np0005546909 kernel: rcu: 	RCU event tracing is enabled.
Dec  5 06:03:41 np0005546909 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  5 06:03:41 np0005546909 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  5 06:03:41 np0005546909 kernel: 	Rude variant of Tasks RCU enabled.
Dec  5 06:03:41 np0005546909 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  5 06:03:41 np0005546909 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  5 06:03:41 np0005546909 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  5 06:03:41 np0005546909 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:03:41 np0005546909 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:03:41 np0005546909 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  5 06:03:41 np0005546909 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  5 06:03:41 np0005546909 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  5 06:03:41 np0005546909 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  5 06:03:41 np0005546909 kernel: Console: colour VGA+ 80x25
Dec  5 06:03:41 np0005546909 kernel: printk: console [ttyS0] enabled
Dec  5 06:03:41 np0005546909 kernel: ACPI: Core revision 20230331
Dec  5 06:03:41 np0005546909 kernel: APIC: Switch to symmetric I/O mode setup
Dec  5 06:03:41 np0005546909 kernel: x2apic enabled
Dec  5 06:03:41 np0005546909 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  5 06:03:41 np0005546909 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  5 06:03:41 np0005546909 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec  5 06:03:41 np0005546909 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  5 06:03:41 np0005546909 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  5 06:03:41 np0005546909 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  5 06:03:41 np0005546909 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  5 06:03:41 np0005546909 kernel: Spectre V2 : Mitigation: Retpolines
Dec  5 06:03:41 np0005546909 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  5 06:03:41 np0005546909 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  5 06:03:41 np0005546909 kernel: RETBleed: Mitigation: untrained return thunk
Dec  5 06:03:41 np0005546909 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  5 06:03:41 np0005546909 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  5 06:03:41 np0005546909 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  5 06:03:41 np0005546909 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  5 06:03:41 np0005546909 kernel: x86/bugs: return thunk changed
Dec  5 06:03:41 np0005546909 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  5 06:03:41 np0005546909 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  5 06:03:41 np0005546909 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  5 06:03:41 np0005546909 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  5 06:03:41 np0005546909 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  5 06:03:41 np0005546909 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  5 06:03:41 np0005546909 kernel: Freeing SMP alternatives memory: 40K
Dec  5 06:03:41 np0005546909 kernel: pid_max: default: 32768 minimum: 301
Dec  5 06:03:41 np0005546909 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  5 06:03:41 np0005546909 kernel: landlock: Up and running.
Dec  5 06:03:41 np0005546909 kernel: Yama: becoming mindful.
Dec  5 06:03:41 np0005546909 kernel: SELinux:  Initializing.
Dec  5 06:03:41 np0005546909 kernel: LSM support for eBPF active
Dec  5 06:03:41 np0005546909 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  5 06:03:41 np0005546909 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  5 06:03:41 np0005546909 kernel: ... version:                0
Dec  5 06:03:41 np0005546909 kernel: ... bit width:              48
Dec  5 06:03:41 np0005546909 kernel: ... generic registers:      6
Dec  5 06:03:41 np0005546909 kernel: ... value mask:             0000ffffffffffff
Dec  5 06:03:41 np0005546909 kernel: ... max period:             00007fffffffffff
Dec  5 06:03:41 np0005546909 kernel: ... fixed-purpose events:   0
Dec  5 06:03:41 np0005546909 kernel: ... event mask:             000000000000003f
Dec  5 06:03:41 np0005546909 kernel: signal: max sigframe size: 1776
Dec  5 06:03:41 np0005546909 kernel: rcu: Hierarchical SRCU implementation.
Dec  5 06:03:41 np0005546909 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  5 06:03:41 np0005546909 kernel: smp: Bringing up secondary CPUs ...
Dec  5 06:03:41 np0005546909 kernel: smpboot: x86: Booting SMP configuration:
Dec  5 06:03:41 np0005546909 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  5 06:03:41 np0005546909 kernel: smp: Brought up 1 node, 8 CPUs
Dec  5 06:03:41 np0005546909 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec  5 06:03:41 np0005546909 kernel: node 0 deferred pages initialised in 10ms
Dec  5 06:03:41 np0005546909 kernel: Memory: 7763740K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618208K reserved, 0K cma-reserved)
Dec  5 06:03:41 np0005546909 kernel: devtmpfs: initialized
Dec  5 06:03:41 np0005546909 kernel: x86/mm: Memory block size: 128MB
Dec  5 06:03:41 np0005546909 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  5 06:03:41 np0005546909 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec  5 06:03:41 np0005546909 kernel: pinctrl core: initialized pinctrl subsystem
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  5 06:03:41 np0005546909 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  5 06:03:41 np0005546909 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  5 06:03:41 np0005546909 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  5 06:03:41 np0005546909 kernel: audit: initializing netlink subsys (disabled)
Dec  5 06:03:41 np0005546909 kernel: audit: type=2000 audit(1764932619.865:1): state=initialized audit_enabled=0 res=1
Dec  5 06:03:41 np0005546909 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  5 06:03:41 np0005546909 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  5 06:03:41 np0005546909 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  5 06:03:41 np0005546909 kernel: cpuidle: using governor menu
Dec  5 06:03:41 np0005546909 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  5 06:03:41 np0005546909 kernel: PCI: Using configuration type 1 for base access
Dec  5 06:03:41 np0005546909 kernel: PCI: Using configuration type 1 for extended access
Dec  5 06:03:41 np0005546909 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  5 06:03:41 np0005546909 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  5 06:03:41 np0005546909 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  5 06:03:41 np0005546909 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  5 06:03:41 np0005546909 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  5 06:03:41 np0005546909 kernel: Demotion targets for Node 0: null
Dec  5 06:03:41 np0005546909 kernel: cryptd: max_cpu_qlen set to 1000
Dec  5 06:03:41 np0005546909 kernel: ACPI: Added _OSI(Module Device)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Added _OSI(Processor Device)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  5 06:03:41 np0005546909 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  5 06:03:41 np0005546909 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  5 06:03:41 np0005546909 kernel: ACPI: Interpreter enabled
Dec  5 06:03:41 np0005546909 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  5 06:03:41 np0005546909 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  5 06:03:41 np0005546909 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  5 06:03:41 np0005546909 kernel: PCI: Using E820 reservations for host bridge windows
Dec  5 06:03:41 np0005546909 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  5 06:03:41 np0005546909 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [3] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [4] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [5] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [6] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [7] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [8] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [9] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [10] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [11] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [12] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [13] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [14] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [15] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [16] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [17] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [18] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [19] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [20] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [21] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [22] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [23] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [24] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [25] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [26] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [27] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [28] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [29] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [30] registered
Dec  5 06:03:41 np0005546909 kernel: acpiphp: Slot [31] registered
Dec  5 06:03:41 np0005546909 kernel: PCI host bridge to bus 0000:00
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  5 06:03:41 np0005546909 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  5 06:03:41 np0005546909 kernel: iommu: Default domain type: Translated
Dec  5 06:03:41 np0005546909 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  5 06:03:41 np0005546909 kernel: SCSI subsystem initialized
Dec  5 06:03:41 np0005546909 kernel: ACPI: bus type USB registered
Dec  5 06:03:41 np0005546909 kernel: usbcore: registered new interface driver usbfs
Dec  5 06:03:41 np0005546909 kernel: usbcore: registered new interface driver hub
Dec  5 06:03:41 np0005546909 kernel: usbcore: registered new device driver usb
Dec  5 06:03:41 np0005546909 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  5 06:03:41 np0005546909 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  5 06:03:41 np0005546909 kernel: PTP clock support registered
Dec  5 06:03:41 np0005546909 kernel: EDAC MC: Ver: 3.0.0
Dec  5 06:03:41 np0005546909 kernel: NetLabel: Initializing
Dec  5 06:03:41 np0005546909 kernel: NetLabel:  domain hash size = 128
Dec  5 06:03:41 np0005546909 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  5 06:03:41 np0005546909 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  5 06:03:41 np0005546909 kernel: PCI: Using ACPI for IRQ routing
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  5 06:03:41 np0005546909 kernel: vgaarb: loaded
Dec  5 06:03:41 np0005546909 kernel: clocksource: Switched to clocksource kvm-clock
Dec  5 06:03:41 np0005546909 kernel: VFS: Disk quotas dquot_6.6.0
Dec  5 06:03:41 np0005546909 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  5 06:03:41 np0005546909 kernel: pnp: PnP ACPI init
Dec  5 06:03:41 np0005546909 kernel: pnp: PnP ACPI: found 5 devices
Dec  5 06:03:41 np0005546909 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_INET protocol family
Dec  5 06:03:41 np0005546909 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  5 06:03:41 np0005546909 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_XDP protocol family
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  5 06:03:41 np0005546909 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  5 06:03:41 np0005546909 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  5 06:03:41 np0005546909 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 76426 usecs
Dec  5 06:03:41 np0005546909 kernel: PCI: CLS 0 bytes, default 64
Dec  5 06:03:41 np0005546909 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  5 06:03:41 np0005546909 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  5 06:03:41 np0005546909 kernel: ACPI: bus type thunderbolt registered
Dec  5 06:03:41 np0005546909 kernel: Trying to unpack rootfs image as initramfs...
Dec  5 06:03:41 np0005546909 kernel: Initialise system trusted keyrings
Dec  5 06:03:41 np0005546909 kernel: Key type blacklist registered
Dec  5 06:03:41 np0005546909 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  5 06:03:41 np0005546909 kernel: zbud: loaded
Dec  5 06:03:41 np0005546909 kernel: integrity: Platform Keyring initialized
Dec  5 06:03:41 np0005546909 kernel: integrity: Machine keyring initialized
Dec  5 06:03:41 np0005546909 kernel: Freeing initrd memory: 87804K
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_ALG protocol family
Dec  5 06:03:41 np0005546909 kernel: xor: automatically using best checksumming function   avx       
Dec  5 06:03:41 np0005546909 kernel: Key type asymmetric registered
Dec  5 06:03:41 np0005546909 kernel: Asymmetric key parser 'x509' registered
Dec  5 06:03:41 np0005546909 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  5 06:03:41 np0005546909 kernel: io scheduler mq-deadline registered
Dec  5 06:03:41 np0005546909 kernel: io scheduler kyber registered
Dec  5 06:03:41 np0005546909 kernel: io scheduler bfq registered
Dec  5 06:03:41 np0005546909 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  5 06:03:41 np0005546909 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  5 06:03:41 np0005546909 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  5 06:03:41 np0005546909 kernel: ACPI: button: Power Button [PWRF]
Dec  5 06:03:41 np0005546909 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  5 06:03:41 np0005546909 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  5 06:03:41 np0005546909 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  5 06:03:41 np0005546909 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  5 06:03:41 np0005546909 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  5 06:03:41 np0005546909 kernel: Non-volatile memory driver v1.3
Dec  5 06:03:41 np0005546909 kernel: rdac: device handler registered
Dec  5 06:03:41 np0005546909 kernel: hp_sw: device handler registered
Dec  5 06:03:41 np0005546909 kernel: emc: device handler registered
Dec  5 06:03:41 np0005546909 kernel: alua: device handler registered
Dec  5 06:03:41 np0005546909 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  5 06:03:41 np0005546909 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  5 06:03:41 np0005546909 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  5 06:03:41 np0005546909 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  5 06:03:41 np0005546909 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  5 06:03:41 np0005546909 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  5 06:03:41 np0005546909 kernel: usb usb1: Product: UHCI Host Controller
Dec  5 06:03:41 np0005546909 kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec  5 06:03:41 np0005546909 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  5 06:03:41 np0005546909 kernel: hub 1-0:1.0: USB hub found
Dec  5 06:03:41 np0005546909 kernel: hub 1-0:1.0: 2 ports detected
Dec  5 06:03:41 np0005546909 kernel: usbcore: registered new interface driver usbserial_generic
Dec  5 06:03:41 np0005546909 kernel: usbserial: USB Serial support registered for generic
Dec  5 06:03:41 np0005546909 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  5 06:03:41 np0005546909 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  5 06:03:41 np0005546909 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  5 06:03:41 np0005546909 kernel: mousedev: PS/2 mouse device common for all mice
Dec  5 06:03:41 np0005546909 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  5 06:03:41 np0005546909 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  5 06:03:41 np0005546909 kernel: rtc_cmos 00:04: registered as rtc0
Dec  5 06:03:41 np0005546909 kernel: rtc_cmos 00:04: setting system clock to 2025-12-05T11:03:40 UTC (1764932620)
Dec  5 06:03:41 np0005546909 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  5 06:03:41 np0005546909 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  5 06:03:41 np0005546909 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  5 06:03:41 np0005546909 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  5 06:03:41 np0005546909 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  5 06:03:41 np0005546909 kernel: usbcore: registered new interface driver usbhid
Dec  5 06:03:41 np0005546909 kernel: usbhid: USB HID core driver
Dec  5 06:03:41 np0005546909 kernel: drop_monitor: Initializing network drop monitor service
Dec  5 06:03:41 np0005546909 kernel: Initializing XFRM netlink socket
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_INET6 protocol family
Dec  5 06:03:41 np0005546909 kernel: Segment Routing with IPv6
Dec  5 06:03:41 np0005546909 kernel: NET: Registered PF_PACKET protocol family
Dec  5 06:03:41 np0005546909 kernel: mpls_gso: MPLS GSO support
Dec  5 06:03:41 np0005546909 kernel: IPI shorthand broadcast: enabled
Dec  5 06:03:41 np0005546909 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  5 06:03:41 np0005546909 kernel: AES CTR mode by8 optimization enabled
Dec  5 06:03:41 np0005546909 kernel: sched_clock: Marking stable (1157026929, 143918630)->(1412857209, -111911650)
Dec  5 06:03:41 np0005546909 kernel: registered taskstats version 1
Dec  5 06:03:41 np0005546909 kernel: Loading compiled-in X.509 certificates
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  5 06:03:41 np0005546909 kernel: Demotion targets for Node 0: null
Dec  5 06:03:41 np0005546909 kernel: page_owner is disabled
Dec  5 06:03:41 np0005546909 kernel: Key type .fscrypt registered
Dec  5 06:03:41 np0005546909 kernel: Key type fscrypt-provisioning registered
Dec  5 06:03:41 np0005546909 kernel: Key type big_key registered
Dec  5 06:03:41 np0005546909 kernel: Key type encrypted registered
Dec  5 06:03:41 np0005546909 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  5 06:03:41 np0005546909 kernel: Loading compiled-in module X.509 certificates
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  5 06:03:41 np0005546909 kernel: ima: Allocated hash algorithm: sha256
Dec  5 06:03:41 np0005546909 kernel: ima: No architecture policies found
Dec  5 06:03:41 np0005546909 kernel: evm: Initialising EVM extended attributes:
Dec  5 06:03:41 np0005546909 kernel: evm: security.selinux
Dec  5 06:03:41 np0005546909 kernel: evm: security.SMACK64 (disabled)
Dec  5 06:03:41 np0005546909 kernel: evm: security.SMACK64EXEC (disabled)
Dec  5 06:03:41 np0005546909 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  5 06:03:41 np0005546909 kernel: evm: security.SMACK64MMAP (disabled)
Dec  5 06:03:41 np0005546909 kernel: evm: security.apparmor (disabled)
Dec  5 06:03:41 np0005546909 kernel: evm: security.ima
Dec  5 06:03:41 np0005546909 kernel: evm: security.capability
Dec  5 06:03:41 np0005546909 kernel: evm: HMAC attrs: 0x1
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  5 06:03:41 np0005546909 kernel: Running certificate verification RSA selftest
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  5 06:03:41 np0005546909 kernel: Running certificate verification ECDSA selftest
Dec  5 06:03:41 np0005546909 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  5 06:03:41 np0005546909 kernel: clk: Disabling unused clocks
Dec  5 06:03:41 np0005546909 kernel: Freeing unused decrypted memory: 2028K
Dec  5 06:03:41 np0005546909 kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec  5 06:03:41 np0005546909 kernel: Write protecting the kernel read-only data: 30720k
Dec  5 06:03:41 np0005546909 kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec  5 06:03:41 np0005546909 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  5 06:03:41 np0005546909 kernel: Run /init as init process
Dec  5 06:03:41 np0005546909 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  5 06:03:41 np0005546909 systemd: Detected virtualization kvm.
Dec  5 06:03:41 np0005546909 systemd: Detected architecture x86-64.
Dec  5 06:03:41 np0005546909 systemd: Running in initrd.
Dec  5 06:03:41 np0005546909 systemd: No hostname configured, using default hostname.
Dec  5 06:03:41 np0005546909 systemd: Hostname set to <localhost>.
Dec  5 06:03:41 np0005546909 systemd: Initializing machine ID from VM UUID.
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: Manufacturer: QEMU
Dec  5 06:03:41 np0005546909 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  5 06:03:41 np0005546909 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  5 06:03:41 np0005546909 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  5 06:03:41 np0005546909 systemd: Queued start job for default target Initrd Default Target.
Dec  5 06:03:41 np0005546909 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  5 06:03:41 np0005546909 systemd: Reached target Local Encrypted Volumes.
Dec  5 06:03:41 np0005546909 systemd: Reached target Initrd /usr File System.
Dec  5 06:03:41 np0005546909 systemd: Reached target Local File Systems.
Dec  5 06:03:41 np0005546909 systemd: Reached target Path Units.
Dec  5 06:03:41 np0005546909 systemd: Reached target Slice Units.
Dec  5 06:03:41 np0005546909 systemd: Reached target Swaps.
Dec  5 06:03:41 np0005546909 systemd: Reached target Timer Units.
Dec  5 06:03:41 np0005546909 systemd: Listening on D-Bus System Message Bus Socket.
Dec  5 06:03:41 np0005546909 systemd: Listening on Journal Socket (/dev/log).
Dec  5 06:03:41 np0005546909 systemd: Listening on Journal Socket.
Dec  5 06:03:41 np0005546909 systemd: Listening on udev Control Socket.
Dec  5 06:03:41 np0005546909 systemd: Listening on udev Kernel Socket.
Dec  5 06:03:41 np0005546909 systemd: Reached target Socket Units.
Dec  5 06:03:41 np0005546909 systemd: Starting Create List of Static Device Nodes...
Dec  5 06:03:41 np0005546909 systemd: Starting Journal Service...
Dec  5 06:03:41 np0005546909 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  5 06:03:41 np0005546909 systemd: Starting Apply Kernel Variables...
Dec  5 06:03:41 np0005546909 systemd: Starting Create System Users...
Dec  5 06:03:41 np0005546909 systemd: Starting Setup Virtual Console...
Dec  5 06:03:41 np0005546909 systemd: Finished Create List of Static Device Nodes.
Dec  5 06:03:41 np0005546909 systemd: Finished Apply Kernel Variables.
Dec  5 06:03:41 np0005546909 systemd-journald[307]: Journal started
Dec  5 06:03:41 np0005546909 systemd-journald[307]: Runtime Journal (/run/log/journal/60bd4df1481e4d2395858528ade5c2b1) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:03:41 np0005546909 systemd-sysusers[311]: Creating group 'users' with GID 100.
Dec  5 06:03:41 np0005546909 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Dec  5 06:03:41 np0005546909 systemd: Started Journal Service.
Dec  5 06:03:41 np0005546909 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Create System Users.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  5 06:03:41 np0005546909 systemd[1]: Starting Create Volatile Files and Directories...
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Setup Virtual Console.
Dec  5 06:03:41 np0005546909 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting dracut cmdline hook...
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Create Volatile Files and Directories.
Dec  5 06:03:41 np0005546909 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec  5 06:03:41 np0005546909 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  5 06:03:41 np0005546909 systemd[1]: Finished dracut cmdline hook.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting dracut pre-udev hook...
Dec  5 06:03:41 np0005546909 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  5 06:03:41 np0005546909 kernel: device-mapper: uevent: version 1.0.3
Dec  5 06:03:41 np0005546909 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  5 06:03:41 np0005546909 kernel: RPC: Registered named UNIX socket transport module.
Dec  5 06:03:41 np0005546909 kernel: RPC: Registered udp transport module.
Dec  5 06:03:41 np0005546909 kernel: RPC: Registered tcp transport module.
Dec  5 06:03:41 np0005546909 kernel: RPC: Registered tcp-with-tls transport module.
Dec  5 06:03:41 np0005546909 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  5 06:03:41 np0005546909 rpc.statd[443]: Version 2.5.4 starting
Dec  5 06:03:41 np0005546909 rpc.statd[443]: Initializing NSM state
Dec  5 06:03:41 np0005546909 rpc.idmapd[448]: Setting log level to 0
Dec  5 06:03:41 np0005546909 systemd[1]: Finished dracut pre-udev hook.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  5 06:03:41 np0005546909 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec  5 06:03:41 np0005546909 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting dracut pre-trigger hook...
Dec  5 06:03:41 np0005546909 systemd[1]: Finished dracut pre-trigger hook.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting Coldplug All udev Devices...
Dec  5 06:03:41 np0005546909 systemd[1]: Created slice Slice /system/modprobe.
Dec  5 06:03:41 np0005546909 systemd[1]: Starting Load Kernel Module configfs...
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Coldplug All udev Devices.
Dec  5 06:03:41 np0005546909 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:03:41 np0005546909 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:03:41 np0005546909 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  5 06:03:41 np0005546909 systemd[1]: Reached target Network.
Dec  5 06:03:41 np0005546909 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  5 06:03:41 np0005546909 systemd[1]: Starting dracut initqueue hook...
Dec  5 06:03:41 np0005546909 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  5 06:03:41 np0005546909 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  5 06:03:41 np0005546909 kernel: vda: vda1
Dec  5 06:03:41 np0005546909 systemd-udevd[484]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:03:41 np0005546909 kernel: scsi host0: ata_piix
Dec  5 06:03:41 np0005546909 kernel: scsi host1: ata_piix
Dec  5 06:03:41 np0005546909 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  5 06:03:41 np0005546909 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  5 06:03:41 np0005546909 systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Initrd Root Device.
Dec  5 06:03:42 np0005546909 systemd[1]: Mounting Kernel Configuration File System...
Dec  5 06:03:42 np0005546909 systemd[1]: Mounted Kernel Configuration File System.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target System Initialization.
Dec  5 06:03:42 np0005546909 kernel: ata1: found unknown device (class 0)
Dec  5 06:03:42 np0005546909 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  5 06:03:42 np0005546909 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Basic System.
Dec  5 06:03:42 np0005546909 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  5 06:03:42 np0005546909 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  5 06:03:42 np0005546909 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  5 06:03:42 np0005546909 systemd[1]: Finished dracut initqueue hook.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Remote File Systems.
Dec  5 06:03:42 np0005546909 systemd[1]: Starting dracut pre-mount hook...
Dec  5 06:03:42 np0005546909 systemd[1]: Finished dracut pre-mount hook.
Dec  5 06:03:42 np0005546909 systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec  5 06:03:42 np0005546909 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec  5 06:03:42 np0005546909 systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  5 06:03:42 np0005546909 systemd[1]: Mounting /sysroot...
Dec  5 06:03:42 np0005546909 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  5 06:03:42 np0005546909 kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec  5 06:03:42 np0005546909 kernel: XFS (vda1): Ending clean mount
Dec  5 06:03:42 np0005546909 systemd[1]: Mounted /sysroot.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Initrd Root File System.
Dec  5 06:03:42 np0005546909 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  5 06:03:42 np0005546909 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  5 06:03:42 np0005546909 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Initrd File Systems.
Dec  5 06:03:42 np0005546909 systemd[1]: Reached target Initrd Default Target.
Dec  5 06:03:42 np0005546909 systemd[1]: Starting dracut mount hook...
Dec  5 06:03:42 np0005546909 systemd[1]: Finished dracut mount hook.
Dec  5 06:03:42 np0005546909 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  5 06:03:43 np0005546909 rpc.idmapd[448]: exiting on signal 15
Dec  5 06:03:43 np0005546909 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Network.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Timer Units.
Dec  5 06:03:43 np0005546909 systemd[1]: dbus.socket: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Initrd Default Target.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Basic System.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Initrd Root Device.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Initrd /usr File System.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Path Units.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Remote File Systems.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Slice Units.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Socket Units.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target System Initialization.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Local File Systems.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Swaps.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut mount hook.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut pre-mount hook.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut initqueue hook.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Apply Kernel Variables.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Coldplug All udev Devices.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut pre-trigger hook.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Setup Virtual Console.
Dec  5 06:03:43 np0005546909 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Closed udev Control Socket.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Closed udev Kernel Socket.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut pre-udev hook.
Dec  5 06:03:43 np0005546909 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped dracut cmdline hook.
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Cleanup udev Database...
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  5 06:03:43 np0005546909 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Stopped Create System Users.
Dec  5 06:03:43 np0005546909 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Cleanup udev Database.
Dec  5 06:03:43 np0005546909 systemd[1]: Reached target Switch Root.
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Switch Root...
Dec  5 06:03:43 np0005546909 systemd[1]: Switching root.
Dec  5 06:03:43 np0005546909 systemd-journald[307]: Journal stopped
Dec  5 06:03:43 np0005546909 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  5 06:03:43 np0005546909 kernel: audit: type=1404 audit(1764932623.289:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:03:43 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:03:43 np0005546909 kernel: audit: type=1403 audit(1764932623.410:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  5 06:03:43 np0005546909 systemd: Successfully loaded SELinux policy in 124.917ms.
Dec  5 06:03:43 np0005546909 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.387ms.
Dec  5 06:03:43 np0005546909 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  5 06:03:43 np0005546909 systemd: Detected virtualization kvm.
Dec  5 06:03:43 np0005546909 systemd: Detected architecture x86-64.
Dec  5 06:03:43 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:03:43 np0005546909 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd: Stopped Switch Root.
Dec  5 06:03:43 np0005546909 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  5 06:03:43 np0005546909 systemd: Created slice Slice /system/getty.
Dec  5 06:03:43 np0005546909 systemd: Created slice Slice /system/serial-getty.
Dec  5 06:03:43 np0005546909 systemd: Created slice Slice /system/sshd-keygen.
Dec  5 06:03:43 np0005546909 systemd: Created slice User and Session Slice.
Dec  5 06:03:43 np0005546909 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  5 06:03:43 np0005546909 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  5 06:03:43 np0005546909 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  5 06:03:43 np0005546909 systemd: Reached target Local Encrypted Volumes.
Dec  5 06:03:43 np0005546909 systemd: Stopped target Switch Root.
Dec  5 06:03:43 np0005546909 systemd: Stopped target Initrd File Systems.
Dec  5 06:03:43 np0005546909 systemd: Stopped target Initrd Root File System.
Dec  5 06:03:43 np0005546909 systemd: Reached target Local Integrity Protected Volumes.
Dec  5 06:03:43 np0005546909 systemd: Reached target Path Units.
Dec  5 06:03:43 np0005546909 systemd: Reached target rpc_pipefs.target.
Dec  5 06:03:43 np0005546909 systemd: Reached target Slice Units.
Dec  5 06:03:43 np0005546909 systemd: Reached target Swaps.
Dec  5 06:03:43 np0005546909 systemd: Reached target Local Verity Protected Volumes.
Dec  5 06:03:43 np0005546909 systemd: Listening on RPCbind Server Activation Socket.
Dec  5 06:03:43 np0005546909 systemd: Reached target RPC Port Mapper.
Dec  5 06:03:43 np0005546909 systemd: Listening on Process Core Dump Socket.
Dec  5 06:03:43 np0005546909 systemd: Listening on initctl Compatibility Named Pipe.
Dec  5 06:03:43 np0005546909 systemd: Listening on udev Control Socket.
Dec  5 06:03:43 np0005546909 systemd: Listening on udev Kernel Socket.
Dec  5 06:03:43 np0005546909 systemd: Mounting Huge Pages File System...
Dec  5 06:03:43 np0005546909 systemd: Mounting POSIX Message Queue File System...
Dec  5 06:03:43 np0005546909 systemd: Mounting Kernel Debug File System...
Dec  5 06:03:43 np0005546909 systemd: Mounting Kernel Trace File System...
Dec  5 06:03:43 np0005546909 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  5 06:03:43 np0005546909 systemd: Starting Create List of Static Device Nodes...
Dec  5 06:03:43 np0005546909 systemd: Starting Load Kernel Module configfs...
Dec  5 06:03:43 np0005546909 systemd: Starting Load Kernel Module drm...
Dec  5 06:03:43 np0005546909 systemd: Starting Load Kernel Module efi_pstore...
Dec  5 06:03:43 np0005546909 systemd: Starting Load Kernel Module fuse...
Dec  5 06:03:43 np0005546909 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  5 06:03:43 np0005546909 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd: Stopped File System Check on Root Device.
Dec  5 06:03:43 np0005546909 systemd: Stopped Journal Service.
Dec  5 06:03:43 np0005546909 systemd: Starting Journal Service...
Dec  5 06:03:43 np0005546909 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  5 06:03:43 np0005546909 systemd: Starting Generate network units from Kernel command line...
Dec  5 06:03:43 np0005546909 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:03:43 np0005546909 systemd: Starting Remount Root and Kernel File Systems...
Dec  5 06:03:43 np0005546909 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  5 06:03:43 np0005546909 systemd: Starting Apply Kernel Variables...
Dec  5 06:03:43 np0005546909 kernel: fuse: init (API version 7.37)
Dec  5 06:03:43 np0005546909 systemd: Starting Coldplug All udev Devices...
Dec  5 06:03:43 np0005546909 systemd: Mounted Huge Pages File System.
Dec  5 06:03:43 np0005546909 systemd: Mounted POSIX Message Queue File System.
Dec  5 06:03:43 np0005546909 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  5 06:03:43 np0005546909 systemd-journald[677]: Journal started
Dec  5 06:03:43 np0005546909 systemd-journald[677]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:03:43 np0005546909 systemd[1]: Queued start job for default target Multi-User System.
Dec  5 06:03:43 np0005546909 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd: Started Journal Service.
Dec  5 06:03:43 np0005546909 systemd[1]: Mounted Kernel Debug File System.
Dec  5 06:03:43 np0005546909 systemd[1]: Mounted Kernel Trace File System.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Create List of Static Device Nodes.
Dec  5 06:03:43 np0005546909 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:03:43 np0005546909 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  5 06:03:43 np0005546909 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Load Kernel Module fuse.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Generate network units from Kernel command line.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Apply Kernel Variables.
Dec  5 06:03:43 np0005546909 kernel: ACPI: bus type drm_connector registered
Dec  5 06:03:43 np0005546909 systemd[1]: Mounting FUSE Control File System...
Dec  5 06:03:43 np0005546909 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Rebuild Hardware Database...
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  5 06:03:43 np0005546909 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Load/Save OS Random Seed...
Dec  5 06:03:43 np0005546909 systemd[1]: Starting Create System Users...
Dec  5 06:03:43 np0005546909 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Load Kernel Module drm.
Dec  5 06:03:43 np0005546909 systemd[1]: Mounted FUSE Control File System.
Dec  5 06:03:43 np0005546909 systemd-journald[677]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  5 06:03:43 np0005546909 systemd-journald[677]: Received client request to flush runtime journal.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Coldplug All udev Devices.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  5 06:03:43 np0005546909 systemd[1]: Finished Load/Save OS Random Seed.
Dec  5 06:03:43 np0005546909 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Create System Users.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target Preparation for Local File Systems.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target Local File Systems.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  5 06:03:44 np0005546909 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  5 06:03:44 np0005546909 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  5 06:03:44 np0005546909 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Automatic Boot Loader Update...
Dec  5 06:03:44 np0005546909 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Create Volatile Files and Directories...
Dec  5 06:03:44 np0005546909 bootctl[693]: Couldn't find EFI system partition, skipping.
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Automatic Boot Loader Update.
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Create Volatile Files and Directories.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Security Auditing Service...
Dec  5 06:03:44 np0005546909 systemd[1]: Starting RPC Bind...
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Rebuild Journal Catalog...
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  5 06:03:44 np0005546909 auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  5 06:03:44 np0005546909 auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  5 06:03:44 np0005546909 systemd[1]: Started RPC Bind.
Dec  5 06:03:44 np0005546909 augenrules[704]: /sbin/augenrules: No change
Dec  5 06:03:44 np0005546909 augenrules[719]: No rules
Dec  5 06:03:44 np0005546909 augenrules[719]: enabled 1
Dec  5 06:03:44 np0005546909 augenrules[719]: failure 1
Dec  5 06:03:44 np0005546909 augenrules[719]: pid 699
Dec  5 06:03:44 np0005546909 augenrules[719]: rate_limit 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_limit 8192
Dec  5 06:03:44 np0005546909 augenrules[719]: lost 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog 3
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time 60000
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:03:44 np0005546909 augenrules[719]: enabled 1
Dec  5 06:03:44 np0005546909 augenrules[719]: failure 1
Dec  5 06:03:44 np0005546909 augenrules[719]: pid 699
Dec  5 06:03:44 np0005546909 augenrules[719]: rate_limit 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_limit 8192
Dec  5 06:03:44 np0005546909 augenrules[719]: lost 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog 4
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time 60000
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:03:44 np0005546909 augenrules[719]: enabled 1
Dec  5 06:03:44 np0005546909 augenrules[719]: failure 1
Dec  5 06:03:44 np0005546909 augenrules[719]: pid 699
Dec  5 06:03:44 np0005546909 augenrules[719]: rate_limit 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_limit 8192
Dec  5 06:03:44 np0005546909 augenrules[719]: lost 0
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog 4
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time 60000
Dec  5 06:03:44 np0005546909 augenrules[719]: backlog_wait_time_actual 0
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Rebuild Journal Catalog.
Dec  5 06:03:44 np0005546909 systemd[1]: Started Security Auditing Service.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Rebuild Hardware Database.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Update is Completed...
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Update is Completed.
Dec  5 06:03:44 np0005546909 systemd-udevd[727]: Using default interface naming scheme 'rhel-9.0'.
Dec  5 06:03:44 np0005546909 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target System Initialization.
Dec  5 06:03:44 np0005546909 systemd[1]: Started dnf makecache --timer.
Dec  5 06:03:44 np0005546909 systemd[1]: Started Daily rotation of log files.
Dec  5 06:03:44 np0005546909 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target Timer Units.
Dec  5 06:03:44 np0005546909 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  5 06:03:44 np0005546909 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target Socket Units.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting D-Bus System Message Bus...
Dec  5 06:03:44 np0005546909 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:03:44 np0005546909 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Load Kernel Module configfs...
Dec  5 06:03:44 np0005546909 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Load Kernel Module configfs.
Dec  5 06:03:44 np0005546909 systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:03:44 np0005546909 systemd[1]: Started D-Bus System Message Bus.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target Basic System.
Dec  5 06:03:44 np0005546909 dbus-broker-lau[760]: Ready
Dec  5 06:03:44 np0005546909 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  5 06:03:44 np0005546909 systemd[1]: Starting NTP client/server...
Dec  5 06:03:44 np0005546909 chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  5 06:03:44 np0005546909 chronyd[781]: Loaded 0 symmetric keys
Dec  5 06:03:44 np0005546909 chronyd[781]: Using right/UTC timezone to obtain leap second data
Dec  5 06:03:44 np0005546909 chronyd[781]: Loaded seccomp filter (level 2)
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  5 06:03:44 np0005546909 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  5 06:03:44 np0005546909 systemd[1]: Starting IPv4 firewall with iptables...
Dec  5 06:03:44 np0005546909 systemd[1]: Started irqbalance daemon.
Dec  5 06:03:44 np0005546909 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  5 06:03:44 np0005546909 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:03:44 np0005546909 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:03:44 np0005546909 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target sshd-keygen.target.
Dec  5 06:03:44 np0005546909 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  5 06:03:44 np0005546909 systemd[1]: Reached target User and Group Name Lookups.
Dec  5 06:03:44 np0005546909 systemd[1]: Starting User Login Management...
Dec  5 06:03:44 np0005546909 systemd[1]: Started NTP client/server.
Dec  5 06:03:44 np0005546909 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  5 06:03:44 np0005546909 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  5 06:03:44 np0005546909 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  5 06:03:44 np0005546909 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  5 06:03:44 np0005546909 kernel: kvm_amd: TSC scaling supported
Dec  5 06:03:44 np0005546909 kernel: kvm_amd: Nested Virtualization enabled
Dec  5 06:03:44 np0005546909 kernel: kvm_amd: Nested Paging enabled
Dec  5 06:03:44 np0005546909 kernel: kvm_amd: LBR virtualization supported
Dec  5 06:03:44 np0005546909 systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  5 06:03:44 np0005546909 systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  5 06:03:44 np0005546909 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  5 06:03:44 np0005546909 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  5 06:03:44 np0005546909 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  5 06:03:44 np0005546909 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  5 06:03:44 np0005546909 systemd-logind[792]: New seat seat0.
Dec  5 06:03:44 np0005546909 systemd[1]: Started User Login Management.
Dec  5 06:03:44 np0005546909 kernel: Console: switching to colour dummy device 80x25
Dec  5 06:03:44 np0005546909 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  5 06:03:44 np0005546909 kernel: [drm] features: -context_init
Dec  5 06:03:44 np0005546909 kernel: [drm] number of scanouts: 1
Dec  5 06:03:44 np0005546909 kernel: [drm] number of cap sets: 0
Dec  5 06:03:44 np0005546909 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  5 06:03:44 np0005546909 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  5 06:03:44 np0005546909 kernel: Console: switching to colour frame buffer device 128x48
Dec  5 06:03:44 np0005546909 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  5 06:03:45 np0005546909 iptables.init[785]: iptables: Applying firewall rules: [  OK  ]
Dec  5 06:03:45 np0005546909 systemd[1]: Finished IPv4 firewall with iptables.
Dec  5 06:03:45 np0005546909 cloud-init[837]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 05 Dec 2025 11:03:45 +0000. Up 5.91 seconds.
Dec  5 06:03:45 np0005546909 systemd[1]: run-cloud\x2dinit-tmp-tmp5q2ph1a4.mount: Deactivated successfully.
Dec  5 06:03:45 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 06:03:45 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 06:03:45 np0005546909 systemd-hostnamed[851]: Hostname set to <np0005546909.novalocal> (static)
Dec  5 06:03:45 np0005546909 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  5 06:03:45 np0005546909 systemd[1]: Reached target Preparation for Network.
Dec  5 06:03:45 np0005546909 systemd[1]: Starting Network Manager...
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8452] NetworkManager (version 1.54.1-1.el9) is starting... (boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8458] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8538] manager[0x5568d3ad5080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8575] hostname: hostname: using hostnamed
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8576] hostname: static hostname changed from (none) to "np0005546909.novalocal"
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8582] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8692] manager[0x5568d3ad5080]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8693] manager[0x5568d3ad5080]: rfkill: WWAN hardware radio set enabled
Dec  5 06:03:45 np0005546909 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8742] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8744] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8745] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8745] manager: Networking is enabled by state file
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8748] settings: Loaded settings plugin: keyfile (internal)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8766] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8789] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8804] dhcp: init: Using DHCP client 'internal'
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8806] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8820] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8828] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8836] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8845] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8849] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8881] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8886] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8889] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8891] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8894] device (eth0): carrier: link connected
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8897] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8903] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8909] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8914] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8915] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8917] manager: NetworkManager state is now CONNECTING
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8918] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8925] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.8928] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:03:45 np0005546909 systemd[1]: Started Network Manager.
Dec  5 06:03:45 np0005546909 systemd[1]: Reached target Network.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9124] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9134] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 06:03:45 np0005546909 systemd[1]: Starting Network Manager Wait Online...
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9159] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  5 06:03:45 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9262] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9264] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9272] device (lo): Activation: successful, device activated.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9280] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9281] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9285] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9289] device (eth0): Activation: successful, device activated.
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9298] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 06:03:45 np0005546909 NetworkManager[855]: <info>  [1764932625.9301] manager: startup complete
Dec  5 06:03:45 np0005546909 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  5 06:03:45 np0005546909 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  5 06:03:45 np0005546909 systemd[1]: Reached target NFS client services.
Dec  5 06:03:45 np0005546909 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  5 06:03:45 np0005546909 systemd[1]: Reached target Remote File Systems.
Dec  5 06:03:45 np0005546909 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  5 06:03:45 np0005546909 systemd[1]: Finished Network Manager Wait Online.
Dec  5 06:03:45 np0005546909 systemd[1]: Starting Cloud-init: Network Stage...
Dec  5 06:03:46 np0005546909 cloud-init[920]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 05 Dec 2025 11:03:46 +0000. Up 6.87 seconds.
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |  eth0  | True |        38.102.83.241         | 255.255.255.0 | global | fa:16:3e:00:6c:52 |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe00:6c52/64 |       .       |  link  | fa:16:3e:00:6c:52 |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec  5 06:03:46 np0005546909 cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  5 06:03:47 np0005546909 cloud-init[920]: Generating public/private rsa key pair.
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key fingerprint is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: SHA256:wZRtmTk4xsuEHAFXH1CEZ7A5hnT5jJOf/JEm3f1kVic root@np0005546909.novalocal
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key's randomart image is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: +---[RSA 3072]----+
Dec  5 06:03:47 np0005546909 cloud-init[920]: |    .o=**@++     |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |     oo=X+X.     |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |      .+B@..     |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |       .*oo   E o|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |        S+ o o oo|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |          = = . =|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |           + . +.|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |            .   .|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |                 |
Dec  5 06:03:47 np0005546909 cloud-init[920]: +----[SHA256]-----+
Dec  5 06:03:47 np0005546909 cloud-init[920]: Generating public/private ecdsa key pair.
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key fingerprint is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: SHA256:1IoFMFejFR73nAw9wcEDdArEGSpq08/uI1F9/3/Yz88 root@np0005546909.novalocal
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key's randomart image is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: +---[ECDSA 256]---+
Dec  5 06:03:47 np0005546909 cloud-init[920]: |    o.o+X===+o   |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |     o *o* B*.   |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |    . o.+ o =o   |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |   o ..+...      |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |  + ... S. .     |
Dec  5 06:03:47 np0005546909 cloud-init[920]: | . ..o      .    |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |     .o      . o |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |    ...       ooo|
Dec  5 06:03:47 np0005546909 cloud-init[920]: |     oo.       .E|
Dec  5 06:03:47 np0005546909 cloud-init[920]: +----[SHA256]-----+
Dec  5 06:03:47 np0005546909 cloud-init[920]: Generating public/private ed25519 key pair.
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  5 06:03:47 np0005546909 cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key fingerprint is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: SHA256:g7DiXQvFLA7MQpifJ0DQGua+AsTYSz/Yk0wN3orkgv4 root@np0005546909.novalocal
Dec  5 06:03:47 np0005546909 cloud-init[920]: The key's randomart image is:
Dec  5 06:03:47 np0005546909 cloud-init[920]: +--[ED25519 256]--+
Dec  5 06:03:47 np0005546909 cloud-init[920]: |==               |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |*+. .o           |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |**+oo++          |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |o=Bo+=o.         |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |++.@+oo S        |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |+o=oOo . .       |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |o.o .o.          |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |.o               |
Dec  5 06:03:47 np0005546909 cloud-init[920]: |. .E             |
Dec  5 06:03:47 np0005546909 cloud-init[920]: +----[SHA256]-----+
Dec  5 06:03:47 np0005546909 systemd[1]: Finished Cloud-init: Network Stage.
Dec  5 06:03:47 np0005546909 systemd[1]: Reached target Cloud-config availability.
Dec  5 06:03:47 np0005546909 systemd[1]: Reached target Network is Online.
Dec  5 06:03:47 np0005546909 systemd[1]: Starting Cloud-init: Config Stage...
Dec  5 06:03:47 np0005546909 systemd[1]: Starting Crash recovery kernel arming...
Dec  5 06:03:47 np0005546909 systemd[1]: Starting Notify NFS peers of a restart...
Dec  5 06:03:47 np0005546909 systemd[1]: Starting System Logging Service...
Dec  5 06:03:47 np0005546909 sm-notify[1003]: Version 2.5.4 starting
Dec  5 06:03:47 np0005546909 systemd[1]: Starting OpenSSH server daemon...
Dec  5 06:03:47 np0005546909 systemd[1]: Starting Permit User Sessions...
Dec  5 06:03:47 np0005546909 systemd[1]: Started Notify NFS peers of a restart.
Dec  5 06:03:47 np0005546909 systemd[1]: Started OpenSSH server daemon.
Dec  5 06:03:47 np0005546909 systemd[1]: Finished Permit User Sessions.
Dec  5 06:03:47 np0005546909 systemd[1]: Started Command Scheduler.
Dec  5 06:03:47 np0005546909 systemd[1]: Started Getty on tty1.
Dec  5 06:03:47 np0005546909 systemd[1]: Started Serial Getty on ttyS0.
Dec  5 06:03:47 np0005546909 systemd[1]: Reached target Login Prompts.
Dec  5 06:03:47 np0005546909 rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Dec  5 06:03:47 np0005546909 systemd[1]: Started System Logging Service.
Dec  5 06:03:47 np0005546909 rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  5 06:03:47 np0005546909 systemd[1]: Reached target Multi-User System.
Dec  5 06:03:47 np0005546909 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  5 06:03:47 np0005546909 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  5 06:03:47 np0005546909 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  5 06:03:47 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:03:47 np0005546909 kdumpctl[1014]: kdump: No kdump initial ramdisk found.
Dec  5 06:03:47 np0005546909 kdumpctl[1014]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec  5 06:03:47 np0005546909 cloud-init[1159]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 05 Dec 2025 11:03:47 +0000. Up 8.53 seconds.
Dec  5 06:03:48 np0005546909 systemd[1]: Finished Cloud-init: Config Stage.
Dec  5 06:03:48 np0005546909 systemd[1]: Starting Cloud-init: Final Stage...
Dec  5 06:03:48 np0005546909 dracut[1265]: dracut-057-102.git20250818.el9
Dec  5 06:03:48 np0005546909 dracut[1267]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec  5 06:03:48 np0005546909 cloud-init[1295]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 05 Dec 2025 11:03:48 +0000. Up 8.92 seconds.
Dec  5 06:03:48 np0005546909 cloud-init[1330]: #############################################################
Dec  5 06:03:48 np0005546909 cloud-init[1332]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  5 06:03:48 np0005546909 cloud-init[1340]: 256 SHA256:1IoFMFejFR73nAw9wcEDdArEGSpq08/uI1F9/3/Yz88 root@np0005546909.novalocal (ECDSA)
Dec  5 06:03:48 np0005546909 cloud-init[1342]: 256 SHA256:g7DiXQvFLA7MQpifJ0DQGua+AsTYSz/Yk0wN3orkgv4 root@np0005546909.novalocal (ED25519)
Dec  5 06:03:48 np0005546909 cloud-init[1344]: 3072 SHA256:wZRtmTk4xsuEHAFXH1CEZ7A5hnT5jJOf/JEm3f1kVic root@np0005546909.novalocal (RSA)
Dec  5 06:03:48 np0005546909 cloud-init[1345]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  5 06:03:48 np0005546909 cloud-init[1346]: #############################################################
Dec  5 06:03:48 np0005546909 cloud-init[1295]: Cloud-init v. 24.4-7.el9 finished at Fri, 05 Dec 2025 11:03:48 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.11 seconds
Dec  5 06:03:48 np0005546909 systemd[1]: Finished Cloud-init: Final Stage.
Dec  5 06:03:48 np0005546909 systemd[1]: Reached target Cloud-init target.
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  5 06:03:48 np0005546909 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: memstrack is not available
Dec  5 06:03:49 np0005546909 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  5 06:03:49 np0005546909 dracut[1267]: memstrack is not available
Dec  5 06:03:49 np0005546909 dracut[1267]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  5 06:03:49 np0005546909 dracut[1267]: *** Including module: systemd ***
Dec  5 06:03:50 np0005546909 dracut[1267]: *** Including module: fips ***
Dec  5 06:03:50 np0005546909 dracut[1267]: *** Including module: systemd-initrd ***
Dec  5 06:03:50 np0005546909 dracut[1267]: *** Including module: i18n ***
Dec  5 06:03:50 np0005546909 dracut[1267]: *** Including module: drm ***
Dec  5 06:03:50 np0005546909 chronyd[781]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Dec  5 06:03:50 np0005546909 chronyd[781]: System clock TAI offset set to 37 seconds
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: prefixdevname ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: kernel-modules ***
Dec  5 06:03:51 np0005546909 kernel: block vda: the capability attribute has been deprecated.
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: kernel-modules-extra ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: qemu ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: fstab-sys ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: rootfs-block ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: terminfo ***
Dec  5 06:03:51 np0005546909 dracut[1267]: *** Including module: udev-rules ***
Dec  5 06:03:52 np0005546909 dracut[1267]: Skipping udev rule: 91-permissions.rules
Dec  5 06:03:52 np0005546909 dracut[1267]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: virtiofs ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: dracut-systemd ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: usrmount ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: base ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: fs-lib ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: kdumpbase ***
Dec  5 06:03:52 np0005546909 dracut[1267]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  5 06:03:52 np0005546909 dracut[1267]:  microcode_ctl module: mangling fw_dir
Dec  5 06:03:52 np0005546909 dracut[1267]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  5 06:03:52 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  5 06:03:52 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel" is ignored
Dec  5 06:03:52 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  5 06:03:53 np0005546909 dracut[1267]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  5 06:03:53 np0005546909 dracut[1267]: *** Including module: openssl ***
Dec  5 06:03:53 np0005546909 dracut[1267]: *** Including module: shutdown ***
Dec  5 06:03:53 np0005546909 dracut[1267]: *** Including module: squash ***
Dec  5 06:03:53 np0005546909 dracut[1267]: *** Including modules done ***
Dec  5 06:03:53 np0005546909 dracut[1267]: *** Installing kernel module dependencies ***
Dec  5 06:03:54 np0005546909 dracut[1267]: *** Installing kernel module dependencies done ***
Dec  5 06:03:54 np0005546909 dracut[1267]: *** Resolving executable dependencies ***
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 25 affinity is now unmanaged
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 31 affinity is now unmanaged
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 28 affinity is now unmanaged
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 32 affinity is now unmanaged
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 30 affinity is now unmanaged
Dec  5 06:03:55 np0005546909 irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  5 06:03:55 np0005546909 irqbalance[790]: IRQ 29 affinity is now unmanaged
Dec  5 06:03:56 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:03:56 np0005546909 dracut[1267]: *** Resolving executable dependencies done ***
Dec  5 06:03:56 np0005546909 dracut[1267]: *** Generating early-microcode cpio image ***
Dec  5 06:03:56 np0005546909 dracut[1267]: *** Store current command line parameters ***
Dec  5 06:03:56 np0005546909 dracut[1267]: Stored kernel commandline:
Dec  5 06:03:56 np0005546909 dracut[1267]: No dracut internal kernel commandline stored in the initramfs
Dec  5 06:03:56 np0005546909 dracut[1267]: *** Install squash loader ***
Dec  5 06:03:57 np0005546909 dracut[1267]: *** Squashing the files inside the initramfs ***
Dec  5 06:03:58 np0005546909 dracut[1267]: *** Squashing the files inside the initramfs done ***
Dec  5 06:03:58 np0005546909 dracut[1267]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec  5 06:03:58 np0005546909 dracut[1267]: *** Hardlinking files ***
Dec  5 06:03:58 np0005546909 dracut[1267]: *** Hardlinking files done ***
Dec  5 06:03:58 np0005546909 dracut[1267]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec  5 06:03:59 np0005546909 kdumpctl[1014]: kdump: kexec: loaded kdump kernel
Dec  5 06:03:59 np0005546909 kdumpctl[1014]: kdump: Starting kdump: [OK]
Dec  5 06:03:59 np0005546909 systemd[1]: Finished Crash recovery kernel arming.
Dec  5 06:03:59 np0005546909 systemd[1]: Startup finished in 1.484s (kernel) + 2.423s (initrd) + 16.074s (userspace) = 19.982s.
Dec  5 06:04:05 np0005546909 systemd[1]: Created slice User Slice of UID 1000.
Dec  5 06:04:06 np0005546909 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  5 06:04:06 np0005546909 systemd-logind[792]: New session 1 of user zuul.
Dec  5 06:04:06 np0005546909 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  5 06:04:06 np0005546909 systemd[1]: Starting User Manager for UID 1000...
Dec  5 06:04:06 np0005546909 systemd[4300]: Queued start job for default target Main User Target.
Dec  5 06:04:06 np0005546909 systemd[4300]: Created slice User Application Slice.
Dec  5 06:04:06 np0005546909 systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  5 06:04:06 np0005546909 systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 06:04:06 np0005546909 systemd[4300]: Reached target Paths.
Dec  5 06:04:06 np0005546909 systemd[4300]: Reached target Timers.
Dec  5 06:04:06 np0005546909 systemd[4300]: Starting D-Bus User Message Bus Socket...
Dec  5 06:04:06 np0005546909 systemd[4300]: Starting Create User's Volatile Files and Directories...
Dec  5 06:04:06 np0005546909 systemd[4300]: Finished Create User's Volatile Files and Directories.
Dec  5 06:04:06 np0005546909 systemd[4300]: Listening on D-Bus User Message Bus Socket.
Dec  5 06:04:06 np0005546909 systemd[4300]: Reached target Sockets.
Dec  5 06:04:06 np0005546909 systemd[4300]: Reached target Basic System.
Dec  5 06:04:06 np0005546909 systemd[4300]: Reached target Main User Target.
Dec  5 06:04:06 np0005546909 systemd[4300]: Startup finished in 136ms.
Dec  5 06:04:06 np0005546909 systemd[1]: Started User Manager for UID 1000.
Dec  5 06:04:06 np0005546909 systemd[1]: Started Session 1 of User zuul.
Dec  5 06:04:06 np0005546909 python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:04:08 np0005546909 python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:04:14 np0005546909 python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:04:15 np0005546909 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:04:16 np0005546909 python3[4512]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  5 06:04:18 np0005546909 python3[4538]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCipPqWEhKyX4DzIESCTQ5zeFvB37ZfQQTUe3rWhe/03Eja4O1tm2CJAU6o/v+Lq4C404mYmqiSqlPrK9lJclR8ymX6Vgc5kPdmbqL3yuwOHZIETrF6lSZrbAZ+B7KPs+3HGZE9+3cdNAMl2wE+hbkEq4XKqY4IIv4NAiZ1+XAtPreUOzrcWFfXPsN+ArpYmXv6RtvwToVw31Va5i8r0wQdQj8Eu9fgcpp0JD4YMJHk8nqC1MpsviDXYPMio35QAigcj9hqYh654FcwKvtGF82QakFYEUUuqyJx2gSTRrOBzZ9tKnByb9Qlk+8Pqx1aBiGaCjwiIP15av/wWl79eHnpm5gxNJci5Jw/REHkUzi5bcD9m2ZEjYGJSWAKzeLZ3Cw3/jRrYgQPJjrXrhhVE0kxPnFVVFlnI+NkXQRx6snw2OZjWBG4cn8+E+1Lg+xYbgeq8AVZPvTWLlccowOVcDOFZVpsuAUQIZRB/8rVlCmseuaSwhKjGCzAVZegwwbT29c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:18 np0005546909 python3[4562]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:19 np0005546909 python3[4661]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:19 np0005546909 python3[4734]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932658.861529-207-256986111877061/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=61705b5911bf401d89a5c7fe2bc3b5ac_id_rsa follow=False checksum=151ba3e6330bfdea1541c5df9b33a50aaf84a208 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:20 np0005546909 python3[4857]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:20 np0005546909 python3[4928]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932659.7510955-240-49647840172004/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=61705b5911bf401d89a5c7fe2bc3b5ac_id_rsa.pub follow=False checksum=a840b3e226ec9e3618104f55c7f7555a733047f5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:21 np0005546909 python3[4976]: ansible-ping Invoked with data=pong
Dec  5 06:04:22 np0005546909 python3[5000]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:04:24 np0005546909 python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  5 06:04:27 np0005546909 python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:27 np0005546909 python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:27 np0005546909 python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:28 np0005546909 python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:28 np0005546909 python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:28 np0005546909 python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:30 np0005546909 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:30 np0005546909 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:31 np0005546909 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932670.2422287-21-263598303897381/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:31 np0005546909 python3[5437]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:31 np0005546909 python3[5461]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:32 np0005546909 python3[5485]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:32 np0005546909 python3[5509]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:32 np0005546909 python3[5533]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:33 np0005546909 python3[5557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:33 np0005546909 python3[5581]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:33 np0005546909 python3[5605]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:33 np0005546909 python3[5629]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:34 np0005546909 python3[5653]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:34 np0005546909 python3[5677]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:34 np0005546909 python3[5701]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:35 np0005546909 python3[5725]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:35 np0005546909 python3[5749]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:35 np0005546909 python3[5773]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:35 np0005546909 python3[5797]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:36 np0005546909 python3[5823]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:36 np0005546909 python3[5847]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:36 np0005546909 python3[5871]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:37 np0005546909 python3[5895]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:37 np0005546909 python3[5919]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:37 np0005546909 python3[5943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:37 np0005546909 python3[5967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:38 np0005546909 python3[5991]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:38 np0005546909 python3[6015]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:38 np0005546909 python3[6039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:04:41 np0005546909 python3[6065]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  5 06:04:41 np0005546909 systemd[1]: Starting Time & Date Service...
Dec  5 06:04:41 np0005546909 systemd[1]: Started Time & Date Service.
Dec  5 06:04:41 np0005546909 systemd-timedated[6067]: Changed time zone to 'UTC' (UTC).
Dec  5 06:04:41 np0005546909 python3[6096]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:42 np0005546909 python3[6172]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:42 np0005546909 python3[6243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764932681.9862192-153-249807561528417/source _original_basename=tmpbkrn64fr follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:43 np0005546909 python3[6343]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:43 np0005546909 python3[6414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764932682.8548641-183-255391387919749/source _original_basename=tmpcmgjegsd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:44 np0005546909 python3[6516]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:44 np0005546909 python3[6589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764932684.003452-231-102883948401276/source _original_basename=tmphwegjqjq follow=False checksum=9afea3fa7e450257b25577284f0f4f0dfca88d28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:45 np0005546909 python3[6637]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:04:45 np0005546909 python3[6663]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:04:45 np0005546909 python3[6743]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:04:46 np0005546909 python3[6816]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932685.6772873-273-119285366837967/source _original_basename=tmpynbzqnj8 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:04:46 np0005546909 python3[6867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-4bf7-d06f-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:04:47 np0005546909 python3[6895]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-4bf7-d06f-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  5 06:04:48 np0005546909 python3[6923]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:05:10 np0005546909 python3[6949]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:05:11 np0005546909 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  5 06:05:46 np0005546909 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  5 06:05:46 np0005546909 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6721] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 06:05:46 np0005546909 systemd-udevd[6955]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6939] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6972] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6976] device (eth1): carrier: link connected
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6978] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6986] policy: auto-activating connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6993] device (eth1): Activation: starting connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6994] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.6997] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.7001] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:05:46 np0005546909 NetworkManager[855]: <info>  [1764932746.7006] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:05:47 np0005546909 python3[6981]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-8d79-8267-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:05:54 np0005546909 python3[7063]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:05:54 np0005546909 python3[7136]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764932754.1670415-102-53616069002325/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=539d5bbbf3bac426eed0d945c812bcc01cd40bad backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:05:55 np0005546909 python3[7186]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:05:55 np0005546909 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  5 06:05:55 np0005546909 systemd[1]: Stopped Network Manager Wait Online.
Dec  5 06:05:55 np0005546909 systemd[1]: Stopping Network Manager Wait Online...
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7666] caught SIGTERM, shutting down normally.
Dec  5 06:05:55 np0005546909 systemd[1]: Stopping Network Manager...
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7676] dhcp4 (eth0): canceled DHCP transaction
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7677] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7677] dhcp4 (eth0): state changed no lease
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7679] manager: NetworkManager state is now CONNECTING
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7800] dhcp4 (eth1): canceled DHCP transaction
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7801] dhcp4 (eth1): state changed no lease
Dec  5 06:05:55 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:05:55 np0005546909 NetworkManager[855]: <info>  [1764932755.7863] exiting (success)
Dec  5 06:05:55 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:05:55 np0005546909 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  5 06:05:55 np0005546909 systemd[1]: Stopped Network Manager.
Dec  5 06:05:55 np0005546909 systemd[1]: Starting Network Manager...
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.8537] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.8543] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.8611] manager[0x5623e83a1070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 06:05:55 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 06:05:55 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9324] hostname: hostname: using hostnamed
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9325] hostname: static hostname changed from (none) to "np0005546909.novalocal"
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9333] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9341] manager[0x5623e83a1070]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9341] manager[0x5623e83a1070]: rfkill: WWAN hardware radio set enabled
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9388] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9388] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9390] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9391] manager: Networking is enabled by state file
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9395] settings: Loaded settings plugin: keyfile (internal)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9402] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9444] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9463] dhcp: init: Using DHCP client 'internal'
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9468] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9477] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9488] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9504] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9515] device (eth0): carrier: link connected
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9523] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9532] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9533] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9544] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9555] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9564] device (eth1): carrier: link connected
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9571] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9579] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a) (indicated)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9581] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9591] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9603] device (eth1): Activation: starting connection 'Wired connection 1' (93f975fe-2181-3e88-a16c-de115f1f749a)
Dec  5 06:05:55 np0005546909 systemd[1]: Started Network Manager.
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9614] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9623] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9627] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9629] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9633] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9637] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9641] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9645] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9650] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9660] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9664] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9679] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9684] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9713] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9722] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 06:05:55 np0005546909 NetworkManager[7198]: <info>  [1764932755.9730] device (lo): Activation: successful, device activated.
Dec  5 06:05:55 np0005546909 systemd[1]: Starting Network Manager Wait Online...
Dec  5 06:05:56 np0005546909 python3[7251]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-8d79-8267-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8667] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8682] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8766] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8814] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8815] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8819] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8823] device (eth0): Activation: successful, device activated.
Dec  5 06:05:56 np0005546909 NetworkManager[7198]: <info>  [1764932756.8830] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 06:06:06 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:06:25 np0005546909 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.3794] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:06:41 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:06:41 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4155] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4158] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4169] device (eth1): Activation: successful, device activated.
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4177] manager: startup complete
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4182] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <warn>  [1764932801.4189] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4199] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 systemd[1]: Finished Network Manager Wait Online.
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4322] dhcp4 (eth1): canceled DHCP transaction
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4323] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4323] dhcp4 (eth1): state changed no lease
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4340] policy: auto-activating connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4348] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4349] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4352] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4362] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4372] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4824] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4826] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:06:41 np0005546909 NetworkManager[7198]: <info>  [1764932801.4832] device (eth1): Activation: successful, device activated.
Dec  5 06:06:51 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:06:52 np0005546909 python3[7381]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:06:52 np0005546909 python3[7454]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764932812.3509858-259-42282614824109/source _original_basename=tmpr6a0i03o follow=False checksum=9d1ae235b500c1f86ea6820f43fa64d4d77947c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:07:05 np0005546909 systemd[4300]: Starting Mark boot as successful...
Dec  5 06:07:05 np0005546909 systemd[4300]: Finished Mark boot as successful.
Dec  5 06:07:53 np0005546909 systemd-logind[792]: Session 1 logged out. Waiting for processes to exit.
Dec  5 06:10:05 np0005546909 systemd[4300]: Created slice User Background Tasks Slice.
Dec  5 06:10:05 np0005546909 systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Dec  5 06:10:05 np0005546909 systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Dec  5 06:14:09 np0005546909 systemd-logind[792]: New session 3 of user zuul.
Dec  5 06:14:09 np0005546909 systemd[1]: Started Session 3 of User zuul.
Dec  5 06:14:09 np0005546909 python3[7541]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-9a9b-9f2e-000000001cdc-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:09 np0005546909 python3[7570]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:10 np0005546909 python3[7596]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:10 np0005546909 python3[7622]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:10 np0005546909 python3[7648]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:11 np0005546909 python3[7674]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:11 np0005546909 python3[7752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:14:12 np0005546909 python3[7825]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764933251.4433565-480-270729542744686/source _original_basename=tmpnqxdp_4x follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:14:13 np0005546909 python3[7875]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:14:13 np0005546909 systemd[1]: Reloading.
Dec  5 06:14:13 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:14:14 np0005546909 python3[7931]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  5 06:14:15 np0005546909 python3[7957]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:15 np0005546909 python3[7985]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:15 np0005546909 python3[8013]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:15 np0005546909 python3[8041]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:16 np0005546909 python3[8068]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-9a9b-9f2e-000000001ce3-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:14:16 np0005546909 python3[8098]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  5 06:14:18 np0005546909 systemd[1]: session-3.scope: Deactivated successfully.
Dec  5 06:14:18 np0005546909 systemd[1]: session-3.scope: Consumed 3.918s CPU time.
Dec  5 06:14:18 np0005546909 systemd-logind[792]: Session 3 logged out. Waiting for processes to exit.
Dec  5 06:14:18 np0005546909 systemd-logind[792]: Removed session 3.
Dec  5 06:14:20 np0005546909 systemd-logind[792]: New session 4 of user zuul.
Dec  5 06:14:20 np0005546909 systemd[1]: Started Session 4 of User zuul.
Dec  5 06:14:20 np0005546909 python3[8133]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  5 06:14:34 np0005546909 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:14:34 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:14:44 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  Converting 385 SID table entries...
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:14:54 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:14:55 np0005546909 setsebool[8200]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  5 06:14:55 np0005546909 setsebool[8200]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  5 06:15:07 np0005546909 kernel: SELinux:  Converting 388 SID table entries...
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:15:07 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:15:25 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  5 06:15:25 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:15:25 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:15:25 np0005546909 systemd[1]: Reloading.
Dec  5 06:15:25 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:15:25 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:15:45 np0005546909 irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  5 06:15:45 np0005546909 irqbalance[790]: IRQ 27 affinity is now unmanaged
Dec  5 06:15:51 np0005546909 python3[21365]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-af62-e437-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:15:52 np0005546909 kernel: evm: overlay not supported
Dec  5 06:15:52 np0005546909 systemd[4300]: Starting D-Bus User Message Bus...
Dec  5 06:15:52 np0005546909 dbus-broker-launch[21795]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  5 06:15:52 np0005546909 dbus-broker-launch[21795]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  5 06:15:52 np0005546909 systemd[4300]: Started D-Bus User Message Bus.
Dec  5 06:15:52 np0005546909 dbus-broker-lau[21795]: Ready
Dec  5 06:15:52 np0005546909 systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  5 06:15:52 np0005546909 systemd[4300]: Created slice Slice /user.
Dec  5 06:15:52 np0005546909 systemd[4300]: podman-21723.scope: unit configures an IP firewall, but not running as root.
Dec  5 06:15:52 np0005546909 systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Dec  5 06:15:52 np0005546909 systemd[4300]: Started podman-21723.scope.
Dec  5 06:15:52 np0005546909 systemd[4300]: Started podman-pause-972d2d54.scope.
Dec  5 06:15:54 np0005546909 python3[22486]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.150:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.150:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:15:54 np0005546909 python3[22486]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  5 06:15:54 np0005546909 systemd[1]: session-4.scope: Deactivated successfully.
Dec  5 06:15:54 np0005546909 systemd[1]: session-4.scope: Consumed 1min 2.340s CPU time.
Dec  5 06:15:54 np0005546909 systemd-logind[792]: Session 4 logged out. Waiting for processes to exit.
Dec  5 06:15:54 np0005546909 systemd-logind[792]: Removed session 4.
Dec  5 06:16:14 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:16:14 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:16:14 np0005546909 systemd[1]: man-db-cache-update.service: Consumed 1min 985ms CPU time.
Dec  5 06:16:14 np0005546909 systemd[1]: run-r05ad47f3fc23409bba8855ead1a4753d.service: Deactivated successfully.
Dec  5 06:16:20 np0005546909 systemd-logind[792]: New session 5 of user zuul.
Dec  5 06:16:20 np0005546909 systemd[1]: Started Session 5 of User zuul.
Dec  5 06:16:20 np0005546909 python3[29712]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:16:21 np0005546909 python3[29738]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:16:22 np0005546909 python3[29766]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005546909.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  5 06:16:22 np0005546909 python3[29800]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPxMqPca2AFFkthc+SJB5KQce9qhUGI8TmxjbFm+QVrTZb5Aso+AhgnMC72fIw5eFURySQzpF/Zz36QB1PAho/E= zuul@np0005546908.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  5 06:16:23 np0005546909 python3[29878]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:16:23 np0005546909 python3[29951]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764933383.0035822-135-108564735354486/source _original_basename=tmp3x4q2l93 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:16:24 np0005546909 python3[30001]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec  5 06:16:24 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 06:16:24 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 06:16:24 np0005546909 systemd-hostnamed[30006]: Changed pretty hostname to 'compute-0'
Dec  5 06:16:24 np0005546909 systemd-hostnamed[30006]: Hostname set to <compute-0> (static)
Dec  5 06:16:24 np0005546909 NetworkManager[7198]: <info>  [1764933384.8871] hostname: static hostname changed from "np0005546909.novalocal" to "compute-0"
Dec  5 06:16:24 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:16:24 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:16:25 np0005546909 systemd[1]: session-5.scope: Deactivated successfully.
Dec  5 06:16:25 np0005546909 systemd[1]: session-5.scope: Consumed 2.548s CPU time.
Dec  5 06:16:25 np0005546909 systemd-logind[792]: Session 5 logged out. Waiting for processes to exit.
Dec  5 06:16:25 np0005546909 systemd-logind[792]: Removed session 5.
Dec  5 06:16:34 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:16:54 np0005546909 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:19:01 np0005546909 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  5 06:19:01 np0005546909 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  5 06:19:01 np0005546909 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  5 06:19:01 np0005546909 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  5 06:20:58 np0005546909 systemd-logind[792]: New session 6 of user zuul.
Dec  5 06:20:58 np0005546909 systemd[1]: Started Session 6 of User zuul.
Dec  5 06:20:59 np0005546909 python3[30120]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:21:00 np0005546909 python3[30236]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:01 np0005546909 python3[30309]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:01 np0005546909 python3[30335]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:02 np0005546909 python3[30408]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:02 np0005546909 python3[30434]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:02 np0005546909 python3[30507]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:02 np0005546909 python3[30533]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:03 np0005546909 python3[30606]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:03 np0005546909 python3[30632]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:03 np0005546909 python3[30705]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:04 np0005546909 python3[30731]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:04 np0005546909 python3[30804]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:21:04 np0005546909 python3[30830]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  5 06:21:05 np0005546909 python3[30903]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764933660.5983768-33580-18402085169850/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:22:43 np0005546909 python3[30961]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:27:43 np0005546909 systemd[1]: session-6.scope: Deactivated successfully.
Dec  5 06:27:43 np0005546909 systemd[1]: session-6.scope: Consumed 5.276s CPU time.
Dec  5 06:27:43 np0005546909 systemd-logind[792]: Session 6 logged out. Waiting for processes to exit.
Dec  5 06:27:43 np0005546909 systemd-logind[792]: Removed session 6.
Dec  5 06:34:23 np0005546909 systemd-logind[792]: New session 7 of user zuul.
Dec  5 06:34:23 np0005546909 systemd[1]: Started Session 7 of User zuul.
Dec  5 06:34:24 np0005546909 python3.9[31135]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:34:26 np0005546909 python3.9[31316]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:34:33 np0005546909 systemd[1]: session-7.scope: Deactivated successfully.
Dec  5 06:34:33 np0005546909 systemd[1]: session-7.scope: Consumed 8.085s CPU time.
Dec  5 06:34:33 np0005546909 systemd-logind[792]: Session 7 logged out. Waiting for processes to exit.
Dec  5 06:34:33 np0005546909 systemd-logind[792]: Removed session 7.
Dec  5 06:34:39 np0005546909 systemd-logind[792]: New session 8 of user zuul.
Dec  5 06:34:39 np0005546909 systemd[1]: Started Session 8 of User zuul.
Dec  5 06:34:40 np0005546909 python3.9[31528]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:34:40 np0005546909 systemd[1]: session-8.scope: Deactivated successfully.
Dec  5 06:34:40 np0005546909 systemd-logind[792]: Session 8 logged out. Waiting for processes to exit.
Dec  5 06:34:40 np0005546909 systemd-logind[792]: Removed session 8.
Dec  5 06:34:59 np0005546909 systemd-logind[792]: New session 9 of user zuul.
Dec  5 06:34:59 np0005546909 systemd[1]: Started Session 9 of User zuul.
Dec  5 06:35:00 np0005546909 python3.9[31710]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  5 06:35:01 np0005546909 python3.9[31884]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:35:02 np0005546909 python3.9[32036]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:35:03 np0005546909 python3.9[32189]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:35:03 np0005546909 python3.9[32343]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:35:04 np0005546909 python3.9[32495]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:35:05 np0005546909 python3.9[32618]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934504.1262367-73-61209407418530/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:35:06 np0005546909 python3.9[32770]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:35:07 np0005546909 python3.9[32926]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:35:08 np0005546909 python3.9[33078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:35:08 np0005546909 python3.9[33229]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:35:14 np0005546909 python3.9[33482]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:35:15 np0005546909 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  5 06:35:15 np0005546909 irqbalance[790]: IRQ 26 affinity is now unmanaged
Dec  5 06:35:15 np0005546909 python3.9[33632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:35:16 np0005546909 python3.9[33786]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:35:17 np0005546909 python3.9[33944]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:35:18 np0005546909 python3.9[34028]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:36:02 np0005546909 systemd[1]: Reloading.
Dec  5 06:36:02 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:36:02 np0005546909 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  5 06:36:02 np0005546909 systemd[1]: Reloading.
Dec  5 06:36:02 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:36:02 np0005546909 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  5 06:36:02 np0005546909 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  5 06:36:02 np0005546909 systemd[1]: Reloading.
Dec  5 06:36:03 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:36:03 np0005546909 systemd[1]: Starting dnf makecache...
Dec  5 06:36:03 np0005546909 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  5 06:36:03 np0005546909 dnf[34317]: Failed determining last makecache time.
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-barbican-42b4c41831408a8e323 165 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 214 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-cinder-1c00d6490d88e436f26ef 197 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-stevedore-c4acc5639fd2329372142 196 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-cloudkitty-tests-tempest-2c80f8 184 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 187 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 193 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-designate-tests-tempest-347fdbc 175 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-glance-1fd12c29b339f30fe823e 177 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 170 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-manila-3c01b7181572c95dac462 185 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-whitebox-neutron-tests-tempest- 174 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-octavia-ba397f07a7331190208c 175 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-watcher-c014f81a8647287f6dcc 177 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-ansible-config_template-5ccaa22121a7ff 185 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 181 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-swift-dc98a8463506ac520c469a 182 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-python-tempestconf-8515371b7cceebd4282 179 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: delorean-openstack-heat-ui-013accbfd179753bc3f0 180 kB/s | 3.0 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: CentOS Stream 9 - BaseOS                         78 kB/s | 7.3 kB     00:00
Dec  5 06:36:03 np0005546909 dnf[34317]: CentOS Stream 9 - AppStream                      68 kB/s | 7.4 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: CentOS Stream 9 - CRB                            83 kB/s | 7.2 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: CentOS Stream 9 - Extras packages                67 kB/s | 8.3 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: dlrn-antelope-testing                           102 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: dlrn-antelope-build-deps                        111 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: centos9-rabbitmq                                 92 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: centos9-storage                                 108 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: centos9-opstools                                 97 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: NFV SIG OpenvSwitch                              23 kB/s | 3.0 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: repo-setup-centos-appstream                     126 kB/s | 4.4 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: repo-setup-centos-baseos                        169 kB/s | 3.9 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: repo-setup-centos-highavailability              176 kB/s | 3.9 kB     00:00
Dec  5 06:36:04 np0005546909 dnf[34317]: repo-setup-centos-powertools                    219 kB/s | 4.3 kB     00:00
Dec  5 06:36:05 np0005546909 dnf[34317]: Extra Packages for Enterprise Linux 9 - x86_64  241 kB/s |  30 kB     00:00
Dec  5 06:36:05 np0005546909 dnf[34317]: Metadata cache created.
Dec  5 06:36:05 np0005546909 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  5 06:36:05 np0005546909 systemd[1]: Finished dnf makecache.
Dec  5 06:36:05 np0005546909 systemd[1]: dnf-makecache.service: Consumed 1.765s CPU time.
Dec  5 06:37:07 np0005546909 kernel: SELinux:  Converting 2718 SID table entries...
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:37:07 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:37:07 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  5 06:37:07 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:37:07 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:37:07 np0005546909 systemd[1]: Reloading.
Dec  5 06:37:07 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:37:07 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:37:08 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:37:08 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:37:08 np0005546909 systemd[1]: man-db-cache-update.service: Consumed 1.218s CPU time.
Dec  5 06:37:08 np0005546909 systemd[1]: run-rb33580635abc4a4da3a2d8712f8bbec5.service: Deactivated successfully.
Dec  5 06:37:08 np0005546909 python3.9[35600]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:11 np0005546909 python3.9[35881]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  5 06:37:11 np0005546909 python3.9[36033]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  5 06:37:14 np0005546909 python3.9[36186]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:37:15 np0005546909 python3.9[36339]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  5 06:37:16 np0005546909 python3.9[36491]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:37:17 np0005546909 python3.9[36643]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:37:18 np0005546909 python3.9[36766]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934636.9991395-236-241056937324363/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:37:19 np0005546909 python3.9[36918]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:37:20 np0005546909 python3.9[37070]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:23 np0005546909 python3.9[37223]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:37:24 np0005546909 python3.9[37375]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  5 06:37:24 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:37:26 np0005546909 python3.9[37529]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:37:27 np0005546909 python3.9[37687]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 06:37:28 np0005546909 python3.9[37847]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  5 06:37:28 np0005546909 python3.9[38000]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:37:29 np0005546909 python3.9[38158]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  5 06:37:30 np0005546909 python3.9[38310]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:37:32 np0005546909 python3.9[38463]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:37:33 np0005546909 python3.9[38615]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:37:34 np0005546909 python3.9[38738]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934653.0928617-355-256812939155680/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:37:35 np0005546909 python3.9[38890]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:37:35 np0005546909 systemd[1]: Starting Load Kernel Modules...
Dec  5 06:37:35 np0005546909 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  5 06:37:35 np0005546909 kernel: Bridge firewalling registered
Dec  5 06:37:35 np0005546909 systemd-modules-load[38894]: Inserted module 'br_netfilter'
Dec  5 06:37:35 np0005546909 systemd[1]: Finished Load Kernel Modules.
Dec  5 06:37:36 np0005546909 python3.9[39049]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:37:36 np0005546909 python3.9[39172]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934655.6348865-378-104020133287761/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:37:37 np0005546909 python3.9[39324]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:37:40 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:37:40 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:37:40 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:37:40 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:37:40 np0005546909 systemd[1]: Reloading.
Dec  5 06:37:40 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:37:41 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:37:42 np0005546909 python3.9[40680]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:37:43 np0005546909 python3.9[41645]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  5 06:37:43 np0005546909 python3.9[42411]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:37:44 np0005546909 python3.9[43282]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:44 np0005546909 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  5 06:37:44 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:37:44 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:37:44 np0005546909 systemd[1]: man-db-cache-update.service: Consumed 4.971s CPU time.
Dec  5 06:37:44 np0005546909 systemd[1]: run-r356ff7713f124989b473af8497147361.service: Deactivated successfully.
Dec  5 06:37:45 np0005546909 systemd[1]: Starting Authorization Manager...
Dec  5 06:37:45 np0005546909 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  5 06:37:45 np0005546909 polkitd[43701]: Started polkitd version 0.117
Dec  5 06:37:45 np0005546909 systemd[1]: Started Authorization Manager.
Dec  5 06:37:46 np0005546909 python3.9[43871]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:37:46 np0005546909 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  5 06:37:46 np0005546909 systemd[1]: tuned.service: Deactivated successfully.
Dec  5 06:37:46 np0005546909 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  5 06:37:46 np0005546909 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  5 06:37:46 np0005546909 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  5 06:37:47 np0005546909 python3.9[44033]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  5 06:37:49 np0005546909 python3.9[44185]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:37:49 np0005546909 systemd[1]: Reloading.
Dec  5 06:37:50 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:37:50 np0005546909 python3.9[44373]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:37:51 np0005546909 systemd[1]: Reloading.
Dec  5 06:37:51 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:37:52 np0005546909 python3.9[44563]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:52 np0005546909 python3.9[44716]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:52 np0005546909 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  5 06:37:53 np0005546909 python3.9[44869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:55 np0005546909 python3.9[45031]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:37:56 np0005546909 python3.9[45184]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:37:56 np0005546909 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  5 06:37:56 np0005546909 systemd[1]: Stopped Apply Kernel Variables.
Dec  5 06:37:56 np0005546909 systemd[1]: Stopping Apply Kernel Variables...
Dec  5 06:37:56 np0005546909 systemd[1]: Starting Apply Kernel Variables...
Dec  5 06:37:56 np0005546909 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  5 06:37:56 np0005546909 systemd[1]: Finished Apply Kernel Variables.
Dec  5 06:37:57 np0005546909 systemd[1]: session-9.scope: Deactivated successfully.
Dec  5 06:37:57 np0005546909 systemd[1]: session-9.scope: Consumed 2min 17.023s CPU time.
Dec  5 06:37:57 np0005546909 systemd-logind[792]: Session 9 logged out. Waiting for processes to exit.
Dec  5 06:37:57 np0005546909 systemd-logind[792]: Removed session 9.
Dec  5 06:38:02 np0005546909 systemd-logind[792]: New session 10 of user zuul.
Dec  5 06:38:02 np0005546909 systemd[1]: Started Session 10 of User zuul.
Dec  5 06:38:04 np0005546909 python3.9[45367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:38:05 np0005546909 python3.9[45521]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:38:06 np0005546909 python3.9[45677]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:38:07 np0005546909 python3.9[45828]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:38:08 np0005546909 python3.9[45984]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:38:09 np0005546909 python3.9[46068]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:38:11 np0005546909 python3.9[46221]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:38:12 np0005546909 python3.9[46392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:38:13 np0005546909 python3.9[46544]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:38:13 np0005546909 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1772982915-merged.mount: Deactivated successfully.
Dec  5 06:38:13 np0005546909 podman[46545]: 2025-12-05 11:38:13.299616903 +0000 UTC m=+0.059632046 system refresh
Dec  5 06:38:14 np0005546909 python3.9[46708]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:38:14 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:38:15 np0005546909 python3.9[46831]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934693.5554507-109-274498842155679/.source.json follow=False _original_basename=podman_network_config.j2 checksum=7ff3ae51ba32067874fed07a7a999793302cd1b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:38:15 np0005546909 python3.9[46983]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:38:16 np0005546909 python3.9[47106]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934695.2314103-124-93566517516550/.source.conf follow=False _original_basename=registries.conf.j2 checksum=88b6a52c62914061ba0322e1e0763af09791b362 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:38:17 np0005546909 python3.9[47258]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:38:18 np0005546909 python3.9[47410]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:38:18 np0005546909 python3.9[47562]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:38:19 np0005546909 python3.9[47714]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:38:20 np0005546909 python3.9[47864]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:38:21 np0005546909 python3.9[48018]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:23 np0005546909 python3.9[48171]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:26 np0005546909 python3.9[48331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:28 np0005546909 python3.9[48484]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:30 np0005546909 python3.9[48637]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:32 np0005546909 python3.9[48793]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:40 np0005546909 python3.9[48962]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:43 np0005546909 python3.9[49115]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:54 np0005546909 python3.9[49452]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:38:56 np0005546909 python3.9[49608]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:38:57 np0005546909 python3.9[49783]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:38:57 np0005546909 python3.9[49906]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764934736.6732306-272-96495391628341/.source.json _original_basename=.lgkpuosj follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:38:58 np0005546909 python3.9[50058]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:38:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay-compat1785861986-lower\x2dmapped.mount: Deactivated successfully.
Dec  5 06:39:04 np0005546909 podman[50070]: 2025-12-05 11:39:04.249958626 +0000 UTC m=+5.270625729 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 06:39:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:05 np0005546909 python3.9[50367]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:39:05 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:14 np0005546909 podman[50379]: 2025-12-05 11:39:14.901894501 +0000 UTC m=+9.385652312 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 06:39:14 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:14 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:15 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:16 np0005546909 python3.9[50704]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:39:16 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:17 np0005546909 podman[50716]: 2025-12-05 11:39:17.327395775 +0000 UTC m=+1.240997671 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 06:39:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:18 np0005546909 python3.9[50956]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:39:18 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:29 np0005546909 podman[50969]: 2025-12-05 11:39:29.722889517 +0000 UTC m=+11.322952254 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 06:39:29 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:29 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:29 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:31 np0005546909 python3.9[51257]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:39:31 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:33 np0005546909 podman[51269]: 2025-12-05 11:39:33.8991738 +0000 UTC m=+2.656184221 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec  5 06:39:33 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:33 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:34 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:35 np0005546909 python3.9[51525]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  5 06:39:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:36 np0005546909 podman[51539]: 2025-12-05 11:39:36.251561527 +0000 UTC m=+1.190867474 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec  5 06:39:36 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:36 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:36 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:39:36 np0005546909 systemd[1]: session-10.scope: Deactivated successfully.
Dec  5 06:39:36 np0005546909 systemd[1]: session-10.scope: Consumed 1min 52.772s CPU time.
Dec  5 06:39:36 np0005546909 systemd-logind[792]: Session 10 logged out. Waiting for processes to exit.
Dec  5 06:39:36 np0005546909 systemd-logind[792]: Removed session 10.
Dec  5 06:39:44 np0005546909 systemd-logind[792]: New session 11 of user zuul.
Dec  5 06:39:44 np0005546909 systemd[1]: Started Session 11 of User zuul.
Dec  5 06:39:45 np0005546909 python3.9[51842]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:39:46 np0005546909 python3.9[51998]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  5 06:39:47 np0005546909 python3.9[52151]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:39:49 np0005546909 python3.9[52309]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 06:39:50 np0005546909 python3.9[52469]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:39:51 np0005546909 python3.9[52553]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:39:53 np0005546909 python3.9[52715]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:40:04 np0005546909 kernel: SELinux:  Converting 2731 SID table entries...
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:40:04 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:40:05 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  5 06:40:05 np0005546909 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  5 06:40:06 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:40:06 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:40:06 np0005546909 systemd[1]: Reloading.
Dec  5 06:40:06 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:40:06 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:40:07 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:40:07 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:40:07 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:40:07 np0005546909 systemd[1]: run-r65682e9384264f44bd70efc1f403fb72.service: Deactivated successfully.
Dec  5 06:40:08 np0005546909 python3.9[53813]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:40:08 np0005546909 systemd[1]: Reloading.
Dec  5 06:40:08 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:40:08 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:40:09 np0005546909 systemd[1]: Starting Open vSwitch Database Unit...
Dec  5 06:40:09 np0005546909 chown[53855]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  5 06:40:09 np0005546909 ovs-ctl[53860]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  5 06:40:09 np0005546909 ovs-ctl[53860]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  5 06:40:09 np0005546909 ovs-ctl[53860]: Starting ovsdb-server [  OK  ]
Dec  5 06:40:09 np0005546909 ovs-vsctl[53909]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  5 06:40:09 np0005546909 ovs-vsctl[53929]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2686fa45-e88c-4058-8865-e810ceb89d95\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  5 06:40:09 np0005546909 ovs-ctl[53860]: Configuring Open vSwitch system IDs [  OK  ]
Dec  5 06:40:09 np0005546909 ovs-vsctl[53935]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec  5 06:40:09 np0005546909 ovs-ctl[53860]: Enabling remote OVSDB managers [  OK  ]
Dec  5 06:40:09 np0005546909 systemd[1]: Started Open vSwitch Database Unit.
Dec  5 06:40:09 np0005546909 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  5 06:40:09 np0005546909 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  5 06:40:09 np0005546909 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  5 06:40:09 np0005546909 kernel: openvswitch: Open vSwitch switching datapath
Dec  5 06:40:09 np0005546909 ovs-ctl[53979]: Inserting openvswitch module [  OK  ]
Dec  5 06:40:09 np0005546909 ovs-ctl[53948]: Starting ovs-vswitchd [  OK  ]
Dec  5 06:40:09 np0005546909 ovs-vsctl[53999]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec  5 06:40:09 np0005546909 ovs-ctl[53948]: Enabling remote OVSDB managers [  OK  ]
Dec  5 06:40:09 np0005546909 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  5 06:40:09 np0005546909 systemd[1]: Starting Open vSwitch...
Dec  5 06:40:09 np0005546909 systemd[1]: Finished Open vSwitch.
Dec  5 06:40:10 np0005546909 python3.9[54151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:40:11 np0005546909 python3.9[54303]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  5 06:40:12 np0005546909 kernel: SELinux:  Converting 2745 SID table entries...
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:40:12 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:40:14 np0005546909 python3.9[54458]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:40:14 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  5 06:40:15 np0005546909 python3.9[54616]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:40:17 np0005546909 python3.9[54769]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:40:18 np0005546909 python3.9[55056]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 06:40:19 np0005546909 python3.9[55206]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:40:20 np0005546909 python3.9[55360]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:40:22 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:40:22 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:40:22 np0005546909 systemd[1]: Reloading.
Dec  5 06:40:22 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:40:22 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:40:22 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:40:22 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:40:22 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:40:22 np0005546909 systemd[1]: run-rfc0f268c0020497c97395bc9fcbed7ae.service: Deactivated successfully.
Dec  5 06:40:23 np0005546909 python3.9[55678]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:40:23 np0005546909 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  5 06:40:23 np0005546909 systemd[1]: Stopped Network Manager Wait Online.
Dec  5 06:40:23 np0005546909 systemd[1]: Stopping Network Manager Wait Online...
Dec  5 06:40:23 np0005546909 systemd[1]: Stopping Network Manager...
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8559] caught SIGTERM, shutting down normally.
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8572] dhcp4 (eth0): canceled DHCP transaction
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8572] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8573] dhcp4 (eth0): state changed no lease
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8574] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:40:23 np0005546909 NetworkManager[7198]: <info>  [1764934823.8644] exiting (success)
Dec  5 06:40:23 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:40:23 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:40:23 np0005546909 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  5 06:40:23 np0005546909 systemd[1]: Stopped Network Manager.
Dec  5 06:40:23 np0005546909 systemd[1]: NetworkManager.service: Consumed 14.252s CPU time, 4.3M memory peak, read 0B from disk, written 36.0K to disk.
Dec  5 06:40:23 np0005546909 systemd[1]: Starting Network Manager...
Dec  5 06:40:23 np0005546909 NetworkManager[55691]: <info>  [1764934823.9526] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:f0f69436-bbfa-48e7-b73e-9b22f091bec6)
Dec  5 06:40:23 np0005546909 NetworkManager[55691]: <info>  [1764934823.9530] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  5 06:40:23 np0005546909 NetworkManager[55691]: <info>  [1764934823.9607] manager[0x55a7ba6c8090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  5 06:40:23 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 06:40:24 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0786] hostname: hostname: using hostnamed
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0786] hostname: static hostname changed from (none) to "compute-0"
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0791] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0796] manager[0x55a7ba6c8090]: rfkill: Wi-Fi hardware radio set enabled
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0797] manager[0x55a7ba6c8090]: rfkill: WWAN hardware radio set enabled
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0818] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0826] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0827] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0828] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0828] manager: Networking is enabled by state file
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0830] settings: Loaded settings plugin: keyfile (internal)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0833] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0857] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0868] dhcp: init: Using DHCP client 'internal'
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0871] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0875] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0881] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0888] device (lo): Activation: starting connection 'lo' (0f41f2b7-1648-4484-8c07-96c2526b8b00)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0894] device (eth0): carrier: link connected
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0898] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0902] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0903] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0908] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0915] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0920] device (eth1): carrier: link connected
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0923] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0927] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c) (indicated)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0928] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0932] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0938] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec  5 06:40:24 np0005546909 systemd[1]: Started Network Manager.
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0946] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0953] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0955] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0956] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0958] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0961] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0963] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0964] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0966] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0971] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0973] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0981] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.0992] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1003] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1005] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1009] device (lo): Activation: successful, device activated.
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1015] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1020] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1081] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1086] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1092] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1095] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  5 06:40:24 np0005546909 systemd[1]: Starting Network Manager Wait Online...
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1097] device (eth1): Activation: successful, device activated.
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1129] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1130] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1133] manager: NetworkManager state is now CONNECTED_SITE
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1135] device (eth0): Activation: successful, device activated.
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1138] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  5 06:40:24 np0005546909 NetworkManager[55691]: <info>  [1764934824.1141] manager: startup complete
Dec  5 06:40:24 np0005546909 systemd[1]: Finished Network Manager Wait Online.
Dec  5 06:40:24 np0005546909 python3.9[55904]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:40:29 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:40:29 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:40:29 np0005546909 systemd[1]: Reloading.
Dec  5 06:40:30 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:40:30 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:40:30 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:40:30 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:40:30 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:40:30 np0005546909 systemd[1]: run-ra2880eb5d01b4bc281576c080b6847aa.service: Deactivated successfully.
Dec  5 06:40:31 np0005546909 python3.9[56365]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:40:32 np0005546909 python3.9[56517]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:33 np0005546909 python3.9[56671]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:34 np0005546909 python3.9[56823]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:34 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:40:34 np0005546909 python3.9[56975]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:35 np0005546909 python3.9[57127]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:36 np0005546909 python3.9[57279]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:40:37 np0005546909 python3.9[57402]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934835.906281-229-18206664741806/.source _original_basename=.lswvfp1y follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:37 np0005546909 python3.9[57554]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:38 np0005546909 python3.9[57706]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  5 06:40:39 np0005546909 python3.9[57858]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:41 np0005546909 python3.9[58285]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  5 06:40:43 np0005546909 ansible-async_wrapper.py[58460]: Invoked with j388437315305 300 /home/zuul/.ansible/tmp/ansible-tmp-1764934842.2626042-295-234296007961692/AnsiballZ_edpm_os_net_config.py _
Dec  5 06:40:43 np0005546909 ansible-async_wrapper.py[58463]: Starting module and watcher
Dec  5 06:40:43 np0005546909 ansible-async_wrapper.py[58463]: Start watching 58464 (300)
Dec  5 06:40:43 np0005546909 ansible-async_wrapper.py[58464]: Start module (58464)
Dec  5 06:40:43 np0005546909 ansible-async_wrapper.py[58460]: Return async_wrapper task started.
Dec  5 06:40:43 np0005546909 python3.9[58465]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  5 06:40:43 np0005546909 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  5 06:40:43 np0005546909 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  5 06:40:43 np0005546909 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  5 06:40:43 np0005546909 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  5 06:40:43 np0005546909 kernel: cfg80211: failed to load regulatory.db
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1040] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1068] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1653] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1655] audit: op="connection-add" uuid="4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe" name="br-ex-br" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1675] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1676] audit: op="connection-add" uuid="13c6488b-9eba-4cf8-953e-4c13e4705605" name="br-ex-port" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1695] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1696] audit: op="connection-add" uuid="1045c5c4-a78d-40b3-8826-cb9933e50aab" name="eth1-port" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1714] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1715] audit: op="connection-add" uuid="cb27fa5c-e07f-493f-8997-0129d5ca9f4d" name="vlan20-port" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1732] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1733] audit: op="connection-add" uuid="f4d91186-8b58-4d63-adfd-dd5722e9b835" name="vlan21-port" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1750] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1751] audit: op="connection-add" uuid="61eb692f-aa48-43ca-aa37-ad3fa95b4c5b" name="vlan22-port" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1774] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1794] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1795] audit: op="connection-add" uuid="c342017d-ca50-45e6-93f0-ae90b33fdcef" name="br-ex-if" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1873] audit: op="connection-update" uuid="67f3aebf-819d-5f9b-8650-6c559580f88c" name="ci-private-network" args="connection.timestamp,connection.master,connection.controller,connection.port-type,connection.slave-type,ovs-external-ids.data,ovs-interface.type,ipv4.method,ipv4.addresses,ipv4.routing-rules,ipv4.never-default,ipv4.dns,ipv4.routes,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules,ipv6.dns,ipv6.routes" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1891] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1892] audit: op="connection-add" uuid="1e55f780-b484-4e60-a006-55c2c35ecc05" name="vlan20-if" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1910] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1912] audit: op="connection-add" uuid="f970735e-bc09-4f31-b1a0-4eda24e9d39f" name="vlan21-if" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1933] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1934] audit: op="connection-add" uuid="fe2e9cbd-1f58-45f9-af7a-bbc2221796bf" name="vlan22-if" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1948] audit: op="connection-delete" uuid="93f975fe-2181-3e88-a16c-de115f1f749a" name="Wired connection 1" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1962] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1973] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1976] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1976] audit: op="connection-activate" uuid="4ddb60d7-2a3c-4aa4-9561-51cffb00bfbe" name="br-ex-br" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1978] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1985] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1989] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (13c6488b-9eba-4cf8-953e-4c13e4705605)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1991] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.1996] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2000] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1045c5c4-a78d-40b3-8826-cb9933e50aab)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2002] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2008] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2013] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (cb27fa5c-e07f-493f-8997-0129d5ca9f4d)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2014] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2021] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2025] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (f4d91186-8b58-4d63-adfd-dd5722e9b835)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2027] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2034] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2037] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (61eb692f-aa48-43ca-aa37-ad3fa95b4c5b)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2038] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2041] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2042] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2049] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2053] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2057] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c342017d-ca50-45e6-93f0-ae90b33fdcef)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2058] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2061] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2062] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2063] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2065] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2076] device (eth1): disconnecting for new activation request.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2077] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2079] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2081] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2082] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2085] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2089] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2093] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (1e55f780-b484-4e60-a006-55c2c35ecc05)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2094] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2097] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2098] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2099] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2102] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2106] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2110] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (f970735e-bc09-4f31-b1a0-4eda24e9d39f)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2111] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2114] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2115] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2116] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2119] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2123] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2127] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (fe2e9cbd-1f58-45f9-af7a-bbc2221796bf)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2128] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2131] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2132] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2134] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2135] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2152] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2154] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2157] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2158] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2166] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2171] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2185] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2191] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2193] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2198] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 kernel: ovs-system: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2202] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2205] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2206] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2210] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2213] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2216] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2218] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2223] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): canceled DHCP transaction
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2228] dhcp4 (eth0): state changed no lease
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2230] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  5 06:40:45 np0005546909 kernel: Timeout policy base is empty
Dec  5 06:40:45 np0005546909 systemd-udevd[58470]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2243] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2247] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58466 uid=0 result="fail" reason="Device is not activated"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2254] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2300] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2308] dhcp4 (eth0): state changed new lease, address=38.102.83.241
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2314] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2369] device (eth1): disconnecting for new activation request.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2371] audit: op="connection-activate" uuid="67f3aebf-819d-5f9b-8650-6c559580f88c" name="ci-private-network" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2395] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec  5 06:40:45 np0005546909 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2534] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  5 06:40:45 np0005546909 kernel: br-ex: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2650] device (eth1): Activation: starting connection 'ci-private-network' (67f3aebf-819d-5f9b-8650-6c559580f88c)
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2655] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2673] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2680] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2687] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2693] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2704] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2706] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2708] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 kernel: vlan22: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2717] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2718] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2725] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 systemd-udevd[58472]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2735] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2740] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2746] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2751] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2757] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2764] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2770] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2777] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 kernel: vlan20: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2784] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 systemd-udevd[58471]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2806] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2823] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  5 06:40:45 np0005546909 kernel: vlan21: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2835] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2845] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  5 06:40:45 np0005546909 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2884] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2892] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2902] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2909] device (eth1): Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2918] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2924] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2930] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2937] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.2952] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3009] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3009] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3013] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3016] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3021] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3045] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3053] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3097] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3099] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3103] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3108] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3114] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  5 06:40:45 np0005546909 NetworkManager[55691]: <info>  [1764934845.3119] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  5 06:40:46 np0005546909 NetworkManager[55691]: <info>  [1764934846.4765] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec  5 06:40:46 np0005546909 NetworkManager[55691]: <info>  [1764934846.6696] checkpoint[0x55a7ba69d950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  5 06:40:46 np0005546909 NetworkManager[55691]: <info>  [1764934846.6698] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58466 uid=0 result="success"
Dec  5 06:40:46 np0005546909 NetworkManager[55691]: <info>  [1764934846.9305] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec  5 06:40:46 np0005546909 NetworkManager[55691]: <info>  [1764934846.9318] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec  5 06:40:47 np0005546909 python3.9[58800]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=status _async_dir=/root/.ansible_async
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.3233] audit: op="networking-control" arg="global-dns-configuration" pid=58466 uid=0 result="success"
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.3258] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.3286] audit: op="networking-control" arg="global-dns-configuration" pid=58466 uid=0 result="success"
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.3317] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.4534] checkpoint[0x55a7ba69da20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  5 06:40:47 np0005546909 NetworkManager[55691]: <info>  [1764934847.4538] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58466 uid=0 result="success"
Dec  5 06:40:47 np0005546909 ansible-async_wrapper.py[58464]: Module complete (58464)
Dec  5 06:40:48 np0005546909 ansible-async_wrapper.py[58463]: Done in kid B.
Dec  5 06:40:50 np0005546909 python3.9[58906]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=status _async_dir=/root/.ansible_async
Dec  5 06:40:51 np0005546909 python3.9[59005]: ansible-ansible.legacy.async_status Invoked with jid=j388437315305.58460 mode=cleanup _async_dir=/root/.ansible_async
Dec  5 06:40:52 np0005546909 python3.9[59157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:40:52 np0005546909 python3.9[59280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934851.6832664-322-215790950088430/.source.returncode _original_basename=.6z5k1ypq follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:53 np0005546909 python3.9[59432]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:40:54 np0005546909 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 06:40:54 np0005546909 python3.9[59556]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934853.1310487-338-47066533708795/.source.cfg _original_basename=.0dxhckwd follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:40:55 np0005546909 python3.9[59710]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:40:55 np0005546909 systemd[1]: Reloading Network Manager...
Dec  5 06:40:55 np0005546909 NetworkManager[55691]: <info>  [1764934855.0926] audit: op="reload" arg="0" pid=59714 uid=0 result="success"
Dec  5 06:40:55 np0005546909 NetworkManager[55691]: <info>  [1764934855.0932] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  5 06:40:55 np0005546909 systemd[1]: Reloaded Network Manager.
Dec  5 06:40:55 np0005546909 systemd-logind[792]: Session 11 logged out. Waiting for processes to exit.
Dec  5 06:40:55 np0005546909 systemd[1]: session-11.scope: Deactivated successfully.
Dec  5 06:40:55 np0005546909 systemd[1]: session-11.scope: Consumed 52.823s CPU time.
Dec  5 06:40:55 np0005546909 systemd-logind[792]: Removed session 11.
Dec  5 06:41:00 np0005546909 systemd-logind[792]: New session 12 of user zuul.
Dec  5 06:41:00 np0005546909 systemd[1]: Started Session 12 of User zuul.
Dec  5 06:41:01 np0005546909 python3.9[59898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:41:03 np0005546909 python3.9[60052]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:41:04 np0005546909 python3.9[60242]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:41:04 np0005546909 systemd[1]: session-12.scope: Deactivated successfully.
Dec  5 06:41:04 np0005546909 systemd[1]: session-12.scope: Consumed 2.531s CPU time.
Dec  5 06:41:04 np0005546909 systemd-logind[792]: Session 12 logged out. Waiting for processes to exit.
Dec  5 06:41:04 np0005546909 systemd-logind[792]: Removed session 12.
Dec  5 06:41:05 np0005546909 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  5 06:41:09 np0005546909 systemd-logind[792]: New session 13 of user zuul.
Dec  5 06:41:09 np0005546909 systemd[1]: Started Session 13 of User zuul.
Dec  5 06:41:10 np0005546909 python3.9[60424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:41:11 np0005546909 python3.9[60578]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:41:12 np0005546909 python3.9[60734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:41:13 np0005546909 python3.9[60819]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:41:15 np0005546909 python3.9[60972]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:41:17 np0005546909 python3.9[61164]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:17 np0005546909 python3.9[61316]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:41:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:41:18 np0005546909 python3.9[61480]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:19 np0005546909 python3.9[61558]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:19 np0005546909 python3.9[61710]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:20 np0005546909 python3.9[61788]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:21 np0005546909 python3.9[61940]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:21 np0005546909 python3.9[62092]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:22 np0005546909 python3.9[62244]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:23 np0005546909 python3.9[62396]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:23 np0005546909 python3.9[62548]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:41:26 np0005546909 python3.9[62701]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:41:26 np0005546909 python3.9[62855]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:41:27 np0005546909 python3.9[63007]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:41:28 np0005546909 python3.9[63159]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:41:29 np0005546909 python3.9[63312]: ansible-service_facts Invoked
Dec  5 06:41:29 np0005546909 network[63329]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:41:29 np0005546909 network[63330]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:41:29 np0005546909 network[63331]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:41:34 np0005546909 python3.9[63784]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:41:36 np0005546909 python3.9[63937]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  5 06:41:37 np0005546909 python3.9[64090]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:38 np0005546909 python3.9[64215]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934897.2952838-232-41830739616004/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:39 np0005546909 python3.9[64369]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:39 np0005546909 python3.9[64494]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934898.8885202-247-179010942012333/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:41 np0005546909 python3.9[64648]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:42 np0005546909 python3.9[64802]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:41:43 np0005546909 python3.9[64886]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:41:44 np0005546909 python3.9[65040]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:41:45 np0005546909 python3.9[65124]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:41:45 np0005546909 chronyd[781]: chronyd exiting
Dec  5 06:41:45 np0005546909 systemd[1]: Stopping NTP client/server...
Dec  5 06:41:45 np0005546909 systemd[1]: chronyd.service: Deactivated successfully.
Dec  5 06:41:45 np0005546909 systemd[1]: Stopped NTP client/server.
Dec  5 06:41:45 np0005546909 systemd[1]: Starting NTP client/server...
Dec  5 06:41:45 np0005546909 chronyd[65133]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  5 06:41:45 np0005546909 chronyd[65133]: Frequency -28.217 +/- 0.157 ppm read from /var/lib/chrony/drift
Dec  5 06:41:45 np0005546909 chronyd[65133]: Loaded seccomp filter (level 2)
Dec  5 06:41:45 np0005546909 systemd[1]: Started NTP client/server.
Dec  5 06:41:45 np0005546909 systemd[1]: session-13.scope: Deactivated successfully.
Dec  5 06:41:45 np0005546909 systemd[1]: session-13.scope: Consumed 26.167s CPU time.
Dec  5 06:41:45 np0005546909 systemd-logind[792]: Session 13 logged out. Waiting for processes to exit.
Dec  5 06:41:45 np0005546909 systemd-logind[792]: Removed session 13.
Dec  5 06:41:51 np0005546909 systemd-logind[792]: New session 14 of user zuul.
Dec  5 06:41:51 np0005546909 systemd[1]: Started Session 14 of User zuul.
Dec  5 06:41:52 np0005546909 python3.9[65312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:41:53 np0005546909 python3.9[65468]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:54 np0005546909 python3.9[65643]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:54 np0005546909 python3.9[65721]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.u26zqskb recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:55 np0005546909 python3.9[65873]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:56 np0005546909 python3.9[65996]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934915.2488701-61-19703514054571/.source _original_basename=.3n7_alwn follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:41:57 np0005546909 python3.9[66148]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:57 np0005546909 python3.9[66300]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:58 np0005546909 python3.9[66423]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934917.1844738-85-18260389674718/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:41:59 np0005546909 python3.9[66575]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:41:59 np0005546909 python3.9[66698]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764934918.5976343-85-127597446393764/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:42:00 np0005546909 python3.9[66850]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:01 np0005546909 python3.9[67002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:01 np0005546909 python3.9[67125]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934920.685287-122-259358495079560/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:02 np0005546909 python3.9[67277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:03 np0005546909 python3.9[67400]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934921.9063-137-141056654860834/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:04 np0005546909 python3.9[67552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:42:04 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:04 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:04 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:04 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:04 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:04 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:04 np0005546909 systemd[1]: Starting EDPM Container Shutdown...
Dec  5 06:42:04 np0005546909 systemd[1]: Finished EDPM Container Shutdown.
Dec  5 06:42:05 np0005546909 python3.9[67780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:05 np0005546909 python3.9[67903]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934924.8423848-160-269010773904548/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:06 np0005546909 python3.9[68055]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:07 np0005546909 python3.9[68178]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934926.0552874-175-49225559708041/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:07 np0005546909 python3.9[68330]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:42:07 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:08 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:08 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:08 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:08 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:08 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:08 np0005546909 systemd[1]: Starting Create netns directory...
Dec  5 06:42:08 np0005546909 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 06:42:08 np0005546909 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 06:42:08 np0005546909 systemd[1]: Finished Create netns directory.
Dec  5 06:42:09 np0005546909 python3.9[68556]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:42:09 np0005546909 network[68573]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:42:09 np0005546909 network[68574]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:42:09 np0005546909 network[68575]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:42:12 np0005546909 python3.9[68837]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:42:12 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:12 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:12 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:13 np0005546909 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  5 06:42:13 np0005546909 iptables.init[68879]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  5 06:42:13 np0005546909 iptables.init[68879]: iptables: Flushing firewall rules: [  OK  ]
Dec  5 06:42:13 np0005546909 systemd[1]: iptables.service: Deactivated successfully.
Dec  5 06:42:13 np0005546909 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  5 06:42:14 np0005546909 python3.9[69075]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:42:15 np0005546909 python3.9[69229]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:42:15 np0005546909 systemd[1]: Reloading.
Dec  5 06:42:15 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:42:15 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:42:15 np0005546909 systemd[1]: Starting Netfilter Tables...
Dec  5 06:42:15 np0005546909 systemd[1]: Finished Netfilter Tables.
Dec  5 06:42:16 np0005546909 python3.9[69421]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:42:17 np0005546909 python3.9[69574]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:17 np0005546909 python3.9[69699]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934936.8009057-244-23077902056028/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:18 np0005546909 python3.9[69852]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:42:18 np0005546909 systemd[1]: Reloading OpenSSH server daemon...
Dec  5 06:42:18 np0005546909 systemd[1]: Reloaded OpenSSH server daemon.
Dec  5 06:42:19 np0005546909 python3.9[70008]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:19 np0005546909 python3.9[70160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:20 np0005546909 python3.9[70283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934939.5496688-275-172429782715528/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:21 np0005546909 python3.9[70435]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  5 06:42:21 np0005546909 systemd[1]: Starting Time & Date Service...
Dec  5 06:42:21 np0005546909 systemd[1]: Started Time & Date Service.
Dec  5 06:42:22 np0005546909 python3.9[70591]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:22 np0005546909 python3.9[70743]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:23 np0005546909 python3.9[70866]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934942.4963765-310-225825422247797/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:24 np0005546909 python3.9[71018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:24 np0005546909 python3.9[71141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764934943.6060627-325-172707249628996/.source.yaml _original_basename=.6qka8te6 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:25 np0005546909 python3.9[71293]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:25 np0005546909 python3.9[71416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934944.9808588-340-170395583055235/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:26 np0005546909 python3.9[71568]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:42:27 np0005546909 python3.9[71721]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:42:28 np0005546909 python3[71874]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 06:42:29 np0005546909 python3.9[72026]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:29 np0005546909 python3.9[72149]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934948.5632546-379-257552289614687/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:30 np0005546909 python3.9[72301]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:30 np0005546909 python3.9[72424]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934949.821446-394-104587280866089/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:31 np0005546909 python3.9[72576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:32 np0005546909 python3.9[72699]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934951.1600826-409-43989239146490/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:32 np0005546909 python3.9[72851]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:33 np0005546909 python3.9[72974]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934952.314305-424-183097609432172/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:34 np0005546909 python3.9[73126]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:42:34 np0005546909 python3.9[73249]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764934953.4934638-439-248623026112351/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:35 np0005546909 python3.9[73401]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:36 np0005546909 python3.9[73553]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:42:37 np0005546909 python3.9[73712]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:38 np0005546909 python3.9[73865]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:38 np0005546909 python3.9[74017]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:39 np0005546909 python3.9[74169]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  5 06:42:39 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:42:40 np0005546909 python3.9[74323]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  5 06:42:40 np0005546909 systemd[1]: session-14.scope: Deactivated successfully.
Dec  5 06:42:40 np0005546909 systemd[1]: session-14.scope: Consumed 35.779s CPU time.
Dec  5 06:42:40 np0005546909 systemd-logind[792]: Session 14 logged out. Waiting for processes to exit.
Dec  5 06:42:40 np0005546909 systemd-logind[792]: Removed session 14.
Dec  5 06:42:45 np0005546909 systemd-logind[792]: New session 15 of user zuul.
Dec  5 06:42:45 np0005546909 systemd[1]: Started Session 15 of User zuul.
Dec  5 06:42:46 np0005546909 python3.9[74504]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  5 06:42:47 np0005546909 python3.9[74656]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:42:48 np0005546909 python3.9[74810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:42:49 np0005546909 python3.9[74963]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDOOJVLNbUbXdj/4hnrYG4cBJ/XBhnOoBFKwcsgC1GK+qApIMq4AN9kvFcb69ro6VcgRjtYliCnG5TgVk5pEiS1s8BUyor5fl0SA7FEpF0wOrdG4O4svVl67EKJIrumCPiiYqOkFmxa0uzeWPYlZ9KqmqEFUnfIiBTd3g6oXgX3xUSLM5zhum9rmbo9Wyct2IWSctskbdxSj61pQz84UKinUZfbFbt19R+7hSrz0o7kIRXpX5+BscttG0pmvh21pzIH9KboW12wgqdcPLZTCL4ZLUUBWBzaSkzpbPxeyg1EbbhJyVwuLVOeuFOUblr0KGmQRtOK1XN/BaC0kpkyJEKbqz3tih8cv6n8Fu/lKoNaEratukKNtnRn1v+UD25/a3bMr2Nap67lLNSKPb8hVksVc8I5GfqPl6mpDnsUffi6+2DGh/lXs79VfAJHNhMDo947VU8ntcLR7oebE+e1CWNvLHDxVg2UFD2KjXGUogS+6G2FC7a7LJCHjL+Ul7C0SGE=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHRqsyTsoXMhvSezBrIRTqtqwVV7Nl5EvVW1GsgltpO/#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNPaTPF1sw2zD0sNQJEE1DG4TP9pcJarMUZH8Q9jzRRo4RTGVHJcz3S0FZ2fsO8PHdQzacHi17HUsogXh1a47/E=#012 create=True mode=0644 path=/tmp/ansible.gfdz24ir state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:50 np0005546909 python3.9[75115]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.gfdz24ir' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:42:50 np0005546909 python3.9[75269]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.gfdz24ir state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:42:51 np0005546909 systemd[1]: session-15.scope: Deactivated successfully.
Dec  5 06:42:51 np0005546909 systemd[1]: session-15.scope: Consumed 3.365s CPU time.
Dec  5 06:42:51 np0005546909 systemd-logind[792]: Session 15 logged out. Waiting for processes to exit.
Dec  5 06:42:51 np0005546909 systemd-logind[792]: Removed session 15.
Dec  5 06:42:51 np0005546909 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 06:42:57 np0005546909 systemd-logind[792]: New session 16 of user zuul.
Dec  5 06:42:57 np0005546909 systemd[1]: Started Session 16 of User zuul.
Dec  5 06:42:58 np0005546909 python3.9[75449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:42:59 np0005546909 python3.9[75605]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  5 06:43:00 np0005546909 python3.9[75759]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:43:01 np0005546909 python3.9[75912]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:43:02 np0005546909 python3.9[76065]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:43:03 np0005546909 python3.9[76219]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:43:03 np0005546909 python3.9[76374]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:04 np0005546909 systemd[1]: session-16.scope: Deactivated successfully.
Dec  5 06:43:04 np0005546909 systemd[1]: session-16.scope: Consumed 4.652s CPU time.
Dec  5 06:43:04 np0005546909 systemd-logind[792]: Session 16 logged out. Waiting for processes to exit.
Dec  5 06:43:04 np0005546909 systemd-logind[792]: Removed session 16.
Dec  5 06:43:10 np0005546909 systemd-logind[792]: New session 17 of user zuul.
Dec  5 06:43:10 np0005546909 systemd[1]: Started Session 17 of User zuul.
Dec  5 06:43:11 np0005546909 python3.9[76552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:43:12 np0005546909 python3.9[76708]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:43:12 np0005546909 python3.9[76792]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  5 06:43:15 np0005546909 python3.9[76943]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:43:16 np0005546909 python3.9[77094]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:43:17 np0005546909 python3.9[77244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:43:17 np0005546909 python3.9[77394]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:43:18 np0005546909 systemd[1]: session-17.scope: Deactivated successfully.
Dec  5 06:43:18 np0005546909 systemd[1]: session-17.scope: Consumed 5.914s CPU time.
Dec  5 06:43:18 np0005546909 systemd-logind[792]: Session 17 logged out. Waiting for processes to exit.
Dec  5 06:43:18 np0005546909 systemd-logind[792]: Removed session 17.
Dec  5 06:43:23 np0005546909 systemd-logind[792]: New session 18 of user zuul.
Dec  5 06:43:24 np0005546909 systemd[1]: Started Session 18 of User zuul.
Dec  5 06:43:24 np0005546909 python3.9[77572]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:43:26 np0005546909 python3.9[77728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:27 np0005546909 python3.9[77880]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:28 np0005546909 python3.9[78032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:28 np0005546909 python3.9[78155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935007.4767478-65-235229637528272/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=00d5fa7b0776dc8691dd5500aa71eb90319347cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:29 np0005546909 python3.9[78307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:29 np0005546909 python3.9[78430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935008.9395278-65-231967888147611/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=0a4f92add1239ac937855979d4dc1394f101fc06 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:30 np0005546909 python3.9[78582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:31 np0005546909 python3.9[78705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935010.1437294-65-168593357188580/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=fa258e12043967341bc94c488e45ada802ba7070 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:31 np0005546909 python3.9[78857]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:32 np0005546909 python3.9[79009]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:33 np0005546909 python3.9[79161]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:34 np0005546909 python3.9[79284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935012.9734132-124-167366774484041/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bee7f316921a290d1ed3611022ac3ada626797f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:34 np0005546909 python3.9[79436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:35 np0005546909 python3.9[79559]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935014.3690202-124-244652408961771/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=97dce75f31f2e462f65a5480df34ce7f592c6afa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:36 np0005546909 python3.9[79711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:36 np0005546909 python3.9[79834]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935015.5865023-124-66794612019804/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0a40c100a83cd94b03805d76969b458dfcb20500 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:37 np0005546909 python3.9[79986]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:38 np0005546909 python3.9[80138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:38 np0005546909 python3.9[80290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:39 np0005546909 python3.9[80413]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935018.2274585-183-235252584515357/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=9b2aaaa14407d7741dbcde8ff451c94476d44925 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:39 np0005546909 python3.9[80565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:40 np0005546909 python3.9[80688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935019.4720778-183-52161796388886/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=966f06ea34331b21eb0c32c6650a16517e61449e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:41 np0005546909 python3.9[80840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:41 np0005546909 python3.9[80963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935020.5904458-183-178846736807947/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=805083cb4c57119c8e2c771d0c5426076bb5774c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:42 np0005546909 python3.9[81115]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:43 np0005546909 python3.9[81267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:43 np0005546909 python3.9[81419]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:44 np0005546909 python3.9[81542]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935023.2261763-242-41129542772871/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=57d2215fb64641d772369ce08602177c050bbe39 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:44 np0005546909 python3.9[81694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:45 np0005546909 python3.9[81817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935024.3339856-242-59466079919143/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=966f06ea34331b21eb0c32c6650a16517e61449e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:45 np0005546909 python3.9[81969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:46 np0005546909 python3.9[82092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935025.4169595-242-167977443450319/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=8a3533d254239bf4ebc081bbaae4dd07fea0d059 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:47 np0005546909 python3.9[82246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:48 np0005546909 python3.9[82398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:48 np0005546909 python3.9[82521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935027.8516166-310-240948883901478/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:49 np0005546909 python3.9[82674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:50 np0005546909 python3.9[82827]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:50 np0005546909 python3.9[82950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935029.7569757-334-115794623404925/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:51 np0005546909 python3.9[83102]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:52 np0005546909 python3.9[83256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:52 np0005546909 python3.9[83379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935031.7262492-358-46087594749467/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:53 np0005546909 python3.9[83531]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:54 np0005546909 python3.9[83685]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:54 np0005546909 python3.9[83808]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935033.6556215-382-141298749513856/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:55 np0005546909 python3.9[83960]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:55 np0005546909 chronyd[65133]: Selected source 138.197.135.239 (pool.ntp.org)
Dec  5 06:43:56 np0005546909 python3.9[84112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:56 np0005546909 python3.9[84237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935035.7788403-406-177490227712518/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:57 np0005546909 python3.9[84389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:43:58 np0005546909 python3.9[84541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:43:58 np0005546909 python3.9[84664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935037.797954-430-32997905733313/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:43:59 np0005546909 python3.9[84816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:00 np0005546909 python3.9[84968]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:01 np0005546909 python3.9[85091]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935039.968499-454-281143613470509/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=335c8bf572ed4d0f66556a4c88e62f0503318580 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:01 np0005546909 systemd-logind[792]: Session 18 logged out. Waiting for processes to exit.
Dec  5 06:44:01 np0005546909 systemd[1]: session-18.scope: Deactivated successfully.
Dec  5 06:44:01 np0005546909 systemd[1]: session-18.scope: Consumed 29.093s CPU time.
Dec  5 06:44:01 np0005546909 systemd-logind[792]: Removed session 18.
Dec  5 06:44:06 np0005546909 systemd-logind[792]: New session 19 of user zuul.
Dec  5 06:44:06 np0005546909 systemd[1]: Started Session 19 of User zuul.
Dec  5 06:44:07 np0005546909 python3.9[85269]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:44:08 np0005546909 python3.9[85425]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:09 np0005546909 python3.9[85577]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:10 np0005546909 python3.9[85727]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:44:11 np0005546909 python3.9[85879]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  5 06:44:13 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  5 06:44:13 np0005546909 python3.9[86035]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:44:14 np0005546909 python3.9[86119]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:44:16 np0005546909 python3.9[86272]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:44:17 np0005546909 python3[86427]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  5 06:44:18 np0005546909 python3.9[86579]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:18 np0005546909 python3.9[86731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:19 np0005546909 python3.9[86809]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:20 np0005546909 python3.9[86961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:20 np0005546909 python3.9[87039]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.g67b295g recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:21 np0005546909 python3.9[87191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:21 np0005546909 python3.9[87269]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:22 np0005546909 python3.9[87421]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:23 np0005546909 python3[87574]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 06:44:24 np0005546909 python3.9[87726]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:24 np0005546909 python3.9[87851]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935063.4905515-157-124107120358979/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:25 np0005546909 python3.9[88003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:26 np0005546909 python3.9[88128]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935064.9747233-172-142704256555088/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:26 np0005546909 python3.9[88280]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:27 np0005546909 python3.9[88405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935066.3137581-187-279013907552183/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:28 np0005546909 python3.9[88557]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:28 np0005546909 python3.9[88682]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935067.5185306-202-69376180708121/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:29 np0005546909 python3.9[88834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:30 np0005546909 python3.9[88959]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935068.7542396-217-212900420981031/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:30 np0005546909 python3.9[89111]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:31 np0005546909 python3.9[89263]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:32 np0005546909 python3.9[89418]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:33 np0005546909 python3.9[89570]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:33 np0005546909 python3.9[89723]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:44:34 np0005546909 python3.9[89877]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:35 np0005546909 python3.9[90032]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:36 np0005546909 python3.9[90182]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:44:37 np0005546909 python3.9[90335]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:37 np0005546909 ovs-vsctl[90336]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  5 06:44:38 np0005546909 python3.9[90488]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:38 np0005546909 python3.9[90643]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:44:38 np0005546909 ovs-vsctl[90644]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  5 06:44:39 np0005546909 python3.9[90794]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:44:40 np0005546909 python3.9[90948]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:40 np0005546909 python3.9[91100]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:41 np0005546909 python3.9[91178]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:42 np0005546909 python3.9[91331]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:42 np0005546909 python3.9[91409]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:43 np0005546909 python3.9[91561]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:43 np0005546909 python3.9[91713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:44 np0005546909 python3.9[91791]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:44 np0005546909 python3.9[91943]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:45 np0005546909 python3.9[92021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:46 np0005546909 python3.9[92173]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:44:46 np0005546909 systemd[1]: Reloading.
Dec  5 06:44:46 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:44:46 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:44:47 np0005546909 python3.9[92361]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:47 np0005546909 python3.9[92439]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:48 np0005546909 python3.9[92591]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:49 np0005546909 python3.9[92669]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:50 np0005546909 python3.9[92821]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:44:50 np0005546909 systemd[1]: Reloading.
Dec  5 06:44:50 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:44:50 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:44:50 np0005546909 systemd[1]: Starting Create netns directory...
Dec  5 06:44:50 np0005546909 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 06:44:50 np0005546909 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 06:44:50 np0005546909 systemd[1]: Finished Create netns directory.
Dec  5 06:44:51 np0005546909 python3.9[93014]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:51 np0005546909 python3.9[93166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:52 np0005546909 python3.9[93289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935091.2254436-468-69485999255828/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:53 np0005546909 python3.9[93441]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:44:53 np0005546909 python3.9[93593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:44:54 np0005546909 python3.9[93716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935093.3306-493-210654007584453/.source.json _original_basename=.oqg518va follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:55 np0005546909 python3.9[93868]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:44:57 np0005546909 python3.9[94295]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  5 06:44:58 np0005546909 python3.9[94447]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:44:59 np0005546909 python3.9[94599]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 06:44:59 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:45:00 np0005546909 python3[94761]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:45:00 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:45:00 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:45:00 np0005546909 podman[94796]: 2025-12-05 11:45:00.694457544 +0000 UTC m=+0.054561807 container create 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  5 06:45:00 np0005546909 podman[94796]: 2025-12-05 11:45:00.667372357 +0000 UTC m=+0.027476600 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 06:45:00 np0005546909 python3[94761]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  5 06:45:01 np0005546909 python3.9[94986]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:45:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  5 06:45:02 np0005546909 python3.9[95140]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:02 np0005546909 python3.9[95216]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:45:03 np0005546909 python3.9[95367]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935102.7339227-581-30475193930869/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:03 np0005546909 python3.9[95443]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:45:03 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:04 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:04 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:45:04 np0005546909 python3.9[95554]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:45:04 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:04 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:04 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:45:05 np0005546909 systemd[1]: Starting ovn_controller container...
Dec  5 06:45:05 np0005546909 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  5 06:45:05 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:45:05 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10948cb30dd006528842e9fd92fe4b809ecc45dabd60697bd859f375e9a51494/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 06:45:05 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.
Dec  5 06:45:05 np0005546909 podman[95595]: 2025-12-05 11:45:05.702916139 +0000 UTC m=+0.631561113 container init 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 06:45:05 np0005546909 ovn_controller[95610]: + sudo -E kolla_set_configs
Dec  5 06:45:05 np0005546909 podman[95595]: 2025-12-05 11:45:05.736542984 +0000 UTC m=+0.665187918 container start 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:45:05 np0005546909 edpm-start-podman-container[95595]: ovn_controller
Dec  5 06:45:05 np0005546909 systemd[1]: Created slice User Slice of UID 0.
Dec  5 06:45:05 np0005546909 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  5 06:45:05 np0005546909 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  5 06:45:05 np0005546909 systemd[1]: Starting User Manager for UID 0...
Dec  5 06:45:05 np0005546909 edpm-start-podman-container[95594]: Creating additional drop-in dependency for "ovn_controller" (6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698)
Dec  5 06:45:05 np0005546909 podman[95616]: 2025-12-05 11:45:05.818176387 +0000 UTC m=+0.070562106 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec  5 06:45:05 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:05 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:45:05 np0005546909 systemd[95652]: Queued start job for default target Main User Target.
Dec  5 06:45:05 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:05 np0005546909 systemd[95652]: Created slice User Application Slice.
Dec  5 06:45:05 np0005546909 systemd[95652]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  5 06:45:05 np0005546909 systemd[95652]: Started Daily Cleanup of User's Temporary Directories.
Dec  5 06:45:05 np0005546909 systemd[95652]: Reached target Paths.
Dec  5 06:45:05 np0005546909 systemd[95652]: Reached target Timers.
Dec  5 06:45:05 np0005546909 systemd[95652]: Starting D-Bus User Message Bus Socket...
Dec  5 06:45:05 np0005546909 systemd[95652]: Starting Create User's Volatile Files and Directories...
Dec  5 06:45:05 np0005546909 systemd[95652]: Finished Create User's Volatile Files and Directories.
Dec  5 06:45:05 np0005546909 systemd[95652]: Listening on D-Bus User Message Bus Socket.
Dec  5 06:45:05 np0005546909 systemd[95652]: Reached target Sockets.
Dec  5 06:45:05 np0005546909 systemd[95652]: Reached target Basic System.
Dec  5 06:45:05 np0005546909 systemd[95652]: Reached target Main User Target.
Dec  5 06:45:05 np0005546909 systemd[95652]: Startup finished in 129ms.
Dec  5 06:45:06 np0005546909 systemd[1]: Started User Manager for UID 0.
Dec  5 06:45:06 np0005546909 systemd[1]: Started ovn_controller container.
Dec  5 06:45:06 np0005546909 systemd[1]: 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698-6ce87739bdca08f5.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:45:06 np0005546909 systemd[1]: 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698-6ce87739bdca08f5.service: Failed with result 'exit-code'.
Dec  5 06:45:06 np0005546909 systemd[1]: Started Session c1 of User root.
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: INFO:__main__:Validating config file
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: INFO:__main__:Writing out command to execute
Dec  5 06:45:06 np0005546909 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: ++ cat /run_command
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + ARGS=
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + sudo kolla_copy_cacerts
Dec  5 06:45:06 np0005546909 systemd[1]: Started Session c2 of User root.
Dec  5 06:45:06 np0005546909 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + [[ ! -n '' ]]
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + . kolla_extend_start
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + umask 0022
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00010|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00011|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00013|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec  5 06:45:06 np0005546909 NetworkManager[55691]: <info>  [1764935106.2628] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec  5 06:45:06 np0005546909 NetworkManager[55691]: <info>  [1764935106.2636] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  5 06:45:06 np0005546909 NetworkManager[55691]: <info>  [1764935106.2644] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec  5 06:45:06 np0005546909 NetworkManager[55691]: <info>  [1764935106.2648] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec  5 06:45:06 np0005546909 NetworkManager[55691]: <info>  [1764935106.2651] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00019|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  5 06:45:06 np0005546909 kernel: br-int: entered promiscuous mode
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00020|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00021|features|INFO|OVS Feature: ct_flush, state: supported
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00022|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  5 06:45:06 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:06Z|00023|main|INFO|OVS feature set changed, force recompute.
Dec  5 06:45:06 np0005546909 systemd-udevd[95799]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:45:06 np0005546909 python3.9[95877]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:45:06 np0005546909 ovs-vsctl[95878]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00024|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00025|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00026|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00028|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00029|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00030|main|INFO|OVS feature set changed, force recompute.
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 06:45:07 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:07Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  5 06:45:07 np0005546909 NetworkManager[55691]: <info>  [1764935107.2687] manager: (ovn-1cd8e1-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  5 06:45:07 np0005546909 systemd-udevd[95801]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:45:07 np0005546909 kernel: genev_sys_6081: entered promiscuous mode
Dec  5 06:45:07 np0005546909 NetworkManager[55691]: <info>  [1764935107.2847] device (genev_sys_6081): carrier: link connected
Dec  5 06:45:07 np0005546909 NetworkManager[55691]: <info>  [1764935107.2850] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec  5 06:45:07 np0005546909 python3.9[96030]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:45:07 np0005546909 ovs-vsctl[96035]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  5 06:45:08 np0005546909 python3.9[96188]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:45:08 np0005546909 ovs-vsctl[96189]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  5 06:45:08 np0005546909 systemd[1]: session-19.scope: Deactivated successfully.
Dec  5 06:45:08 np0005546909 systemd[1]: session-19.scope: Consumed 45.349s CPU time.
Dec  5 06:45:08 np0005546909 systemd-logind[792]: Session 19 logged out. Waiting for processes to exit.
Dec  5 06:45:08 np0005546909 systemd-logind[792]: Removed session 19.
Dec  5 06:45:14 np0005546909 systemd-logind[792]: New session 21 of user zuul.
Dec  5 06:45:14 np0005546909 systemd[1]: Started Session 21 of User zuul.
Dec  5 06:45:15 np0005546909 python3.9[96367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:45:16 np0005546909 systemd[1]: Stopping User Manager for UID 0...
Dec  5 06:45:16 np0005546909 systemd[95652]: Activating special unit Exit the Session...
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped target Main User Target.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped target Basic System.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped target Paths.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped target Sockets.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped target Timers.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  5 06:45:16 np0005546909 systemd[95652]: Closed D-Bus User Message Bus Socket.
Dec  5 06:45:16 np0005546909 systemd[95652]: Stopped Create User's Volatile Files and Directories.
Dec  5 06:45:16 np0005546909 systemd[95652]: Removed slice User Application Slice.
Dec  5 06:45:16 np0005546909 systemd[95652]: Reached target Shutdown.
Dec  5 06:45:16 np0005546909 systemd[95652]: Finished Exit the Session.
Dec  5 06:45:16 np0005546909 systemd[95652]: Reached target Exit the Session.
Dec  5 06:45:16 np0005546909 systemd[1]: user@0.service: Deactivated successfully.
Dec  5 06:45:16 np0005546909 systemd[1]: Stopped User Manager for UID 0.
Dec  5 06:45:16 np0005546909 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  5 06:45:16 np0005546909 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  5 06:45:16 np0005546909 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  5 06:45:16 np0005546909 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  5 06:45:16 np0005546909 systemd[1]: Removed slice User Slice of UID 0.
Dec  5 06:45:16 np0005546909 python3.9[96525]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:17 np0005546909 python3.9[96677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:18 np0005546909 python3.9[96829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:18 np0005546909 python3.9[96981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:19 np0005546909 python3.9[97133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:20 np0005546909 python3.9[97283]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:45:21 np0005546909 python3.9[97435]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  5 06:45:22 np0005546909 python3.9[97585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:23 np0005546909 python3.9[97706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935121.9069054-86-229035617534459/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:23 np0005546909 python3.9[97856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:24 np0005546909 python3.9[97977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935123.3868709-101-215872605882899/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:25 np0005546909 python3.9[98129]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:45:26 np0005546909 python3.9[98214]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:45:28 np0005546909 python3.9[98367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:45:29 np0005546909 python3.9[98520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:29 np0005546909 python3.9[98641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935128.8890836-138-85324542870571/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:30 np0005546909 python3.9[98791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:31 np0005546909 python3.9[98912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935130.0398328-138-46926094524425/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:32 np0005546909 python3.9[99062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:32 np0005546909 python3.9[99183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935131.7976916-182-216134272145339/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:33 np0005546909 python3.9[99333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:33 np0005546909 python3.9[99454]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935132.9427874-182-70459949224904/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:34 np0005546909 python3.9[99604]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:45:35 np0005546909 python3.9[99758]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:35 np0005546909 python3.9[99910]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:36 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:36Z|00031|memory|INFO|16000 kB peak resident set size after 30.0 seconds
Dec  5 06:45:36 np0005546909 ovn_controller[95610]: 2025-12-05T11:45:36Z|00032|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec  5 06:45:36 np0005546909 podman[99960]: 2025-12-05 11:45:36.246268129 +0000 UTC m=+0.104535092 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Dec  5 06:45:36 np0005546909 python3.9[100005]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:37 np0005546909 python3.9[100164]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:37 np0005546909 python3.9[100242]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:38 np0005546909 python3.9[100394]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:38 np0005546909 python3.9[100546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:39 np0005546909 python3.9[100624]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:40 np0005546909 python3.9[100776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:40 np0005546909 python3.9[100854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:41 np0005546909 python3.9[101006]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:45:41 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:41 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:41 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:45:42 np0005546909 python3.9[101196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:42 np0005546909 python3.9[101274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:43 np0005546909 python3.9[101426]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:43 np0005546909 python3.9[101504]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:44 np0005546909 python3.9[101656]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:45:44 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:44 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:44 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:45:44 np0005546909 systemd[1]: Starting Create netns directory...
Dec  5 06:45:44 np0005546909 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 06:45:44 np0005546909 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 06:45:44 np0005546909 systemd[1]: Finished Create netns directory.
Dec  5 06:45:45 np0005546909 python3.9[101849]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:46 np0005546909 python3.9[102001]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:47 np0005546909 python3.9[102124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935146.007944-333-192737364057845/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:48 np0005546909 python3.9[102276]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:45:49 np0005546909 python3.9[102429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:45:49 np0005546909 python3.9[102552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935148.3624232-358-257335484211806/.source.json _original_basename=.one4b8ki follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:50 np0005546909 python3.9[102704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:52 np0005546909 python3.9[103131]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  5 06:45:53 np0005546909 python3.9[103283]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:45:54 np0005546909 python3.9[103435]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 06:45:56 np0005546909 python3[103613]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:45:56 np0005546909 podman[103650]: 2025-12-05 11:45:56.413740844 +0000 UTC m=+0.048166884 container create de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Dec  5 06:45:56 np0005546909 podman[103650]: 2025-12-05 11:45:56.390098605 +0000 UTC m=+0.024524675 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 06:45:56 np0005546909 python3[103613]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 06:45:57 np0005546909 python3.9[103840]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:45:57 np0005546909 python3.9[103994]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:58 np0005546909 python3.9[104070]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:45:59 np0005546909 python3.9[104221]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935158.3779027-446-33475902378353/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:45:59 np0005546909 python3.9[104297]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:45:59 np0005546909 systemd[1]: Reloading.
Dec  5 06:45:59 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:45:59 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:46:00 np0005546909 python3.9[104407]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:00 np0005546909 systemd[1]: Reloading.
Dec  5 06:46:00 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:46:00 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:46:00 np0005546909 systemd[1]: Starting ovn_metadata_agent container...
Dec  5 06:46:01 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:46:01 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/436161f6e5b01ff201ecbc4abee31e21a170e03160630140dbf046e60a098b2e/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  5 06:46:01 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/436161f6e5b01ff201ecbc4abee31e21a170e03160630140dbf046e60a098b2e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 06:46:01 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.
Dec  5 06:46:01 np0005546909 podman[104450]: 2025-12-05 11:46:01.083328901 +0000 UTC m=+0.177925469 container init de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + sudo -E kolla_set_configs
Dec  5 06:46:01 np0005546909 podman[104450]: 2025-12-05 11:46:01.112791567 +0000 UTC m=+0.207388095 container start de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  5 06:46:01 np0005546909 edpm-start-podman-container[104450]: ovn_metadata_agent
Dec  5 06:46:01 np0005546909 edpm-start-podman-container[104449]: Creating additional drop-in dependency for "ovn_metadata_agent" (de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc)
Dec  5 06:46:01 np0005546909 podman[104472]: 2025-12-05 11:46:01.178331588 +0000 UTC m=+0.052582890 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 06:46:01 np0005546909 systemd[1]: Reloading.
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Validating config file
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Copying service configuration files
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Writing out command to execute
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: ++ cat /run_command
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + CMD=neutron-ovn-metadata-agent
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + ARGS=
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + sudo kolla_copy_cacerts
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + [[ ! -n '' ]]
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + . kolla_extend_start
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: Running command: 'neutron-ovn-metadata-agent'
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + umask 0022
Dec  5 06:46:01 np0005546909 ovn_metadata_agent[104466]: + exec neutron-ovn-metadata-agent
Dec  5 06:46:01 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:46:01 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:46:01 np0005546909 systemd[1]: Started ovn_metadata_agent container.
Dec  5 06:46:01 np0005546909 systemd[1]: session-21.scope: Deactivated successfully.
Dec  5 06:46:01 np0005546909 systemd[1]: session-21.scope: Consumed 33.945s CPU time.
Dec  5 06:46:01 np0005546909 systemd-logind[792]: Session 21 logged out. Waiting for processes to exit.
Dec  5 06:46:01 np0005546909 systemd-logind[792]: Removed session 21.
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.942 104471 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.942 104471 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.943 104471 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.944 104471 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.945 104471 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.946 104471 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.947 104471 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.948 104471 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.949 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.950 104471 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.951 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.952 104471 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.953 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.954 104471 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.955 104471 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.956 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.957 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.958 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.959 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.960 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.961 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.962 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.963 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.964 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.965 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.966 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.967 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.968 104471 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.969 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.970 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.971 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.972 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.973 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.974 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.975 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.976 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.977 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.978 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.979 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.980 104471 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.981 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.982 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.983 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.984 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.985 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.986 104471 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.986 104471 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.994 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  5 06:46:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:02.995 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.006 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2686fa45-e88c-4058-8865-e810ceb89d95 (UUID: 2686fa45-e88c-4058-8865-e810ceb89d95) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.044 104471 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.045 104471 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.049 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.057 104471 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.065 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2686fa45-e88c-4058-8865-e810ceb89d95'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], external_ids={}, name=2686fa45-e88c-4058-8865-e810ceb89d95, nb_cfg_timestamp=1764935115276, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.066 104471 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f189ffbadc0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.067 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.068 104471 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.073 104471 DEBUG oslo_service.service [-] Started child 104579 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.076 104579 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-66947572'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.077 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp3e9pio4y/privsep.sock']#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.100 104579 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.101 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.101 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.104 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.111 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.119 104579 INFO eventlet.wsgi.server [-] (104579) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  5 06:46:03 np0005546909 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.720 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.721 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3e9pio4y/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.585 104584 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.589 104584 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.592 104584 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.592 104584 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104584#033[00m
Dec  5 06:46:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:03.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b5586c-ea9a-4246-bbad-5a51e446d47b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.202 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.202 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.203 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc972f7-2f9e-4c5a-90b6-a1bdad82a09e]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, column=external_ids, values=({'neutron:ovn-metadata-id': '2017f5d6-7c32-5b30-92fd-9f8ba19f80f8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.741 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.750 104471 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.751 104471 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.751 104471 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.752 104471 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.753 104471 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.753 104471 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.754 104471 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.754 104471 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.755 104471 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.755 104471 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.756 104471 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.756 104471 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.757 104471 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.757 104471 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.758 104471 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.758 104471 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.759 104471 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.760 104471 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.761 104471 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.762 104471 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.763 104471 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.764 104471 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.765 104471 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.766 104471 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.767 104471 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.768 104471 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.769 104471 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.770 104471 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.771 104471 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.772 104471 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.773 104471 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.774 104471 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.775 104471 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.776 104471 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.777 104471 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.778 104471 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.779 104471 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.780 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.781 104471 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.782 104471 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.783 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.784 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.785 104471 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.786 104471 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.787 104471 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.788 104471 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.789 104471 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.790 104471 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.791 104471 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.792 104471 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.793 104471 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.794 104471 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.795 104471 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.796 104471 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.797 104471 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.798 104471 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.799 104471 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.800 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.801 104471 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.802 104471 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.803 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.804 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.805 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.806 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.807 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.808 104471 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:46:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:46:04.809 104471 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 06:46:07 np0005546909 systemd-logind[792]: New session 22 of user zuul.
Dec  5 06:46:07 np0005546909 systemd[1]: Started Session 22 of User zuul.
Dec  5 06:46:07 np0005546909 podman[104591]: 2025-12-05 11:46:07.23169546 +0000 UTC m=+0.094690934 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:46:08 np0005546909 python3.9[104769]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:46:09 np0005546909 python3.9[104925]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:10 np0005546909 python3.9[105090]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:46:10 np0005546909 systemd[1]: Reloading.
Dec  5 06:46:10 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:46:10 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:46:11 np0005546909 python3.9[105275]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:46:12 np0005546909 network[105292]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:46:12 np0005546909 network[105293]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:46:12 np0005546909 network[105294]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:46:18 np0005546909 python3.9[105555]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:19 np0005546909 python3.9[105708]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:20 np0005546909 python3.9[105861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:21 np0005546909 python3.9[106014]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:22 np0005546909 python3.9[106167]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:22 np0005546909 python3.9[106320]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:23 np0005546909 python3.9[106473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:46:24 np0005546909 python3.9[106626]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:25 np0005546909 python3.9[106778]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:25 np0005546909 python3.9[106930]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:26 np0005546909 python3.9[107082]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:27 np0005546909 python3.9[107234]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:27 np0005546909 python3.9[107386]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:28 np0005546909 python3.9[107538]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:29 np0005546909 python3.9[107690]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:29 np0005546909 python3.9[107842]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:30 np0005546909 python3.9[107994]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:30 np0005546909 python3.9[108146]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:31 np0005546909 podman[108270]: 2025-12-05 11:46:31.479866508 +0000 UTC m=+0.053112460 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:46:31 np0005546909 python3.9[108316]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:32 np0005546909 python3.9[108469]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:32 np0005546909 python3.9[108621]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:46:33 np0005546909 python3.9[108773]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:34 np0005546909 python3.9[108925]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:46:35 np0005546909 python3.9[109077]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:46:35 np0005546909 systemd[1]: Reloading.
Dec  5 06:46:35 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:46:35 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:46:36 np0005546909 python3.9[109264]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:36 np0005546909 python3.9[109417]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:37 np0005546909 podman[109542]: 2025-12-05 11:46:37.535967112 +0000 UTC m=+0.149789326 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 06:46:37 np0005546909 python3.9[109581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:38 np0005546909 python3.9[109747]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:38 np0005546909 python3.9[109900]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:39 np0005546909 python3.9[110053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:40 np0005546909 python3.9[110206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:46:41 np0005546909 python3.9[110359]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  5 06:46:42 np0005546909 python3.9[110512]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:46:43 np0005546909 python3.9[110670]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 06:46:44 np0005546909 python3.9[110830]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:46:44 np0005546909 python3.9[110914]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:47:01 np0005546909 podman[111099]: 2025-12-05 11:47:01.790515582 +0000 UTC m=+0.087539571 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:47:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.988 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:47:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.988 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:47:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:47:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:47:08 np0005546909 podman[111124]: 2025-12-05 11:47:08.246007172 +0000 UTC m=+0.095043453 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 06:47:13 np0005546909 kernel: SELinux:  Converting 2757 SID table entries...
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:47:13 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  Converting 2757 SID table entries...
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:47:22 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:47:32 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  5 06:47:32 np0005546909 podman[111167]: 2025-12-05 11:47:32.239758116 +0000 UTC m=+0.077208780 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  5 06:47:39 np0005546909 podman[114096]: 2025-12-05 11:47:39.250868383 +0000 UTC m=+0.093691341 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 06:48:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:48:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.989 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:48:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:48:02.990 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:48:03 np0005546909 podman[128002]: 2025-12-05 11:48:03.205568674 +0000 UTC m=+0.057632373 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 06:48:10 np0005546909 podman[128037]: 2025-12-05 11:48:10.22988219 +0000 UTC m=+0.082069776 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec  5 06:48:15 np0005546909 kernel: SELinux:  Converting 2758 SID table entries...
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability network_peer_controls=1
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability open_perms=1
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability extended_socket_class=1
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability always_check_network=0
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  5 06:48:15 np0005546909 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  5 06:48:16 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:48:16 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  5 06:48:16 np0005546909 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Dec  5 06:48:23 np0005546909 systemd[1]: Stopping OpenSSH server daemon...
Dec  5 06:48:23 np0005546909 systemd[1]: sshd.service: Deactivated successfully.
Dec  5 06:48:23 np0005546909 systemd[1]: Stopped OpenSSH server daemon.
Dec  5 06:48:23 np0005546909 systemd[1]: sshd.service: Consumed 2.960s CPU time, read 32.0K from disk, written 4.0K to disk.
Dec  5 06:48:23 np0005546909 systemd[1]: Stopped target sshd-keygen.target.
Dec  5 06:48:23 np0005546909 systemd[1]: Stopping sshd-keygen.target...
Dec  5 06:48:23 np0005546909 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:48:23 np0005546909 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:48:23 np0005546909 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  5 06:48:23 np0005546909 systemd[1]: Reached target sshd-keygen.target.
Dec  5 06:48:23 np0005546909 systemd[1]: Starting OpenSSH server daemon...
Dec  5 06:48:23 np0005546909 systemd[1]: Started OpenSSH server daemon.
Dec  5 06:48:25 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:48:25 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:48:26 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:26 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:26 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:26 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:48:30 np0005546909 python3.9[134081]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:48:30 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:30 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:30 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:31 np0005546909 python3.9[135327]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:48:31 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:31 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:31 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:32 np0005546909 python3.9[136548]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:48:32 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:32 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:33 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:33 np0005546909 podman[137771]: 2025-12-05 11:48:33.645977819 +0000 UTC m=+0.060523163 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  5 06:48:33 np0005546909 python3.9[137895]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:48:34 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:48:34 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:48:34 np0005546909 systemd[1]: man-db-cache-update.service: Consumed 10.293s CPU time.
Dec  5 06:48:34 np0005546909 systemd[1]: run-r11ad2af578eb459cb50d0e03494c13de.service: Deactivated successfully.
Dec  5 06:48:34 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:34 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:34 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:35 np0005546909 python3.9[138391]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:35 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:35 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:35 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:36 np0005546909 python3.9[138582]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:36 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:36 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:36 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:37 np0005546909 python3.9[138772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:37 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:37 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:37 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:38 np0005546909 python3.9[138962]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:39 np0005546909 python3.9[139117]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:39 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:39 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:39 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:40 np0005546909 podman[139278]: 2025-12-05 11:48:40.417111205 +0000 UTC m=+0.096780448 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:48:40 np0005546909 python3.9[139325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  5 06:48:40 np0005546909 systemd[1]: Reloading.
Dec  5 06:48:40 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:48:40 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:48:41 np0005546909 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  5 06:48:41 np0005546909 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  5 06:48:41 np0005546909 python3.9[139526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:42 np0005546909 python3.9[139681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:43 np0005546909 python3.9[139836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:44 np0005546909 python3.9[139991]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:45 np0005546909 python3.9[140146]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:46 np0005546909 python3.9[140301]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:47 np0005546909 python3.9[140456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:47 np0005546909 python3.9[140611]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:50 np0005546909 python3.9[140766]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:50 np0005546909 python3.9[140921]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:51 np0005546909 python3.9[141076]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:52 np0005546909 python3.9[141231]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:53 np0005546909 python3.9[141386]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:54 np0005546909 python3.9[141541]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  5 06:48:55 np0005546909 python3.9[141696]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:55 np0005546909 python3.9[141848]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:56 np0005546909 python3.9[142000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:57 np0005546909 python3.9[142152]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:57 np0005546909 python3.9[142304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:58 np0005546909 python3.9[142456]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:48:59 np0005546909 python3.9[142608]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:00 np0005546909 python3.9[142733]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935338.8307583-554-171782957020572/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:00 np0005546909 python3.9[142885]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:01 np0005546909 python3.9[143010]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935340.380457-554-254657543180213/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:02 np0005546909 python3.9[143162]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:02 np0005546909 python3.9[143287]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935341.596943-554-11611920050830/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.990 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:49:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:49:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:49:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:49:03 np0005546909 python3.9[143439]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:03 np0005546909 podman[143536]: 2025-12-05 11:49:03.846405463 +0000 UTC m=+0.070176913 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:49:04 np0005546909 python3.9[143579]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935342.9292514-554-219466482027564/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:04 np0005546909 python3.9[143733]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:05 np0005546909 python3.9[143858]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935344.2016222-554-201719294833372/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:05 np0005546909 python3.9[144010]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:06 np0005546909 python3.9[144135]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935345.437748-554-213687459831911/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:07 np0005546909 python3.9[144287]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:07 np0005546909 python3.9[144410]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935346.65228-554-195734451696297/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:08 np0005546909 python3.9[144562]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:08 np0005546909 python3.9[144687]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764935347.8445086-554-126216714884499/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:09 np0005546909 python3.9[144839]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  5 06:49:10 np0005546909 python3.9[144992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:10 np0005546909 podman[145116]: 2025-12-05 11:49:10.912222149 +0000 UTC m=+0.087933080 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 06:49:11 np0005546909 python3.9[145161]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:11 np0005546909 python3.9[145322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:12 np0005546909 python3.9[145474]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:13 np0005546909 python3.9[145626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:13 np0005546909 python3.9[145778]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:14 np0005546909 python3.9[145930]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:15 np0005546909 python3.9[146082]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:15 np0005546909 python3.9[146234]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:16 np0005546909 python3.9[146386]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:17 np0005546909 python3.9[146538]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:17 np0005546909 python3.9[146690]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:18 np0005546909 python3.9[146842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:19 np0005546909 python3.9[146994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:20 np0005546909 python3.9[147146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:20 np0005546909 python3.9[147269]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935359.5988936-775-263359113316731/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:21 np0005546909 python3.9[147421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:22 np0005546909 python3.9[147544]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935360.8178632-775-215581831471210/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:22 np0005546909 python3.9[147696]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:23 np0005546909 python3.9[147819]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935362.2192955-775-105121960303728/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:24 np0005546909 python3.9[147971]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:24 np0005546909 python3.9[148094]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935363.5355566-775-129591414834584/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:25 np0005546909 python3.9[148246]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:26 np0005546909 python3.9[148369]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935364.9294007-775-108819657828384/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:26 np0005546909 python3.9[148521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:27 np0005546909 python3.9[148644]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935366.3013575-775-238325479842852/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:28 np0005546909 python3.9[148796]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:28 np0005546909 python3.9[148919]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935367.5300677-775-130882944250722/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:29 np0005546909 python3.9[149071]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:30 np0005546909 python3.9[149194]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935368.8490222-775-118927576611400/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:30 np0005546909 python3.9[149346]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:31 np0005546909 python3.9[149469]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935370.1985319-775-56974481441264/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:31 np0005546909 python3.9[149621]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:32 np0005546909 python3.9[149744]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935371.3998506-775-180221470410513/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:33 np0005546909 python3.9[149896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:33 np0005546909 python3.9[150019]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935372.7705736-775-250998152773812/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:34 np0005546909 podman[150090]: 2025-12-05 11:49:34.234655074 +0000 UTC m=+0.083845293 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:49:34 np0005546909 python3.9[150191]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:35 np0005546909 python3.9[150314]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935374.038747-775-177884112333369/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:35 np0005546909 python3.9[150466]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:36 np0005546909 python3.9[150589]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935375.4071891-775-198862746481053/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:37 np0005546909 python3.9[150741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:37 np0005546909 python3.9[150864]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935376.6753912-775-193406779632992/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:38 np0005546909 python3.9[151014]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:49:39 np0005546909 python3.9[151169]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  5 06:49:41 np0005546909 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  5 06:49:41 np0005546909 podman[151282]: 2025-12-05 11:49:41.296132401 +0000 UTC m=+0.124650186 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 06:49:41 np0005546909 python3.9[151351]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:42 np0005546909 python3.9[151503]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:42 np0005546909 python3.9[151655]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:43 np0005546909 python3.9[151807]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:44 np0005546909 python3.9[151959]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:44 np0005546909 python3.9[152111]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:45 np0005546909 python3.9[152263]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:46 np0005546909 python3.9[152415]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:47 np0005546909 python3.9[152567]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:47 np0005546909 python3.9[152719]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:49 np0005546909 python3.9[152871]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:49:49 np0005546909 systemd[1]: Reloading.
Dec  5 06:49:49 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:49:49 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:49:49 np0005546909 systemd[1]: Starting libvirt logging daemon socket...
Dec  5 06:49:49 np0005546909 systemd[1]: Listening on libvirt logging daemon socket.
Dec  5 06:49:49 np0005546909 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  5 06:49:49 np0005546909 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  5 06:49:49 np0005546909 systemd[1]: Starting libvirt logging daemon...
Dec  5 06:49:49 np0005546909 systemd[1]: Started libvirt logging daemon.
Dec  5 06:49:50 np0005546909 python3.9[153064]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:49:50 np0005546909 systemd[1]: Reloading.
Dec  5 06:49:50 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:49:50 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:49:51 np0005546909 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  5 06:49:51 np0005546909 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  5 06:49:51 np0005546909 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  5 06:49:51 np0005546909 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  5 06:49:51 np0005546909 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  5 06:49:51 np0005546909 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  5 06:49:51 np0005546909 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  5 06:49:51 np0005546909 systemd[1]: Starting libvirt nodedev daemon...
Dec  5 06:49:51 np0005546909 systemd[1]: Started libvirt nodedev daemon.
Dec  5 06:49:51 np0005546909 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  5 06:49:51 np0005546909 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  5 06:49:51 np0005546909 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  5 06:49:51 np0005546909 python3.9[153289]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:49:51 np0005546909 systemd[1]: Reloading.
Dec  5 06:49:52 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:49:52 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:49:52 np0005546909 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  5 06:49:52 np0005546909 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  5 06:49:52 np0005546909 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  5 06:49:52 np0005546909 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  5 06:49:52 np0005546909 systemd[1]: Starting libvirt proxy daemon...
Dec  5 06:49:52 np0005546909 systemd[1]: Started libvirt proxy daemon.
Dec  5 06:49:52 np0005546909 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 2c1840a9-6925-41dc-8e3a-a6a8d8978d2f
Dec  5 06:49:52 np0005546909 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  5 06:49:52 np0005546909 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 2c1840a9-6925-41dc-8e3a-a6a8d8978d2f
Dec  5 06:49:52 np0005546909 setroubleshoot[153101]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  5 06:49:53 np0005546909 python3.9[153503]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:49:53 np0005546909 systemd[1]: Reloading.
Dec  5 06:49:53 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:49:53 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:49:53 np0005546909 systemd[1]: Listening on libvirt locking daemon socket.
Dec  5 06:49:53 np0005546909 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  5 06:49:53 np0005546909 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  5 06:49:53 np0005546909 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  5 06:49:53 np0005546909 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  5 06:49:53 np0005546909 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  5 06:49:53 np0005546909 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  5 06:49:53 np0005546909 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  5 06:49:53 np0005546909 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  5 06:49:53 np0005546909 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  5 06:49:53 np0005546909 systemd[1]: Starting libvirt QEMU daemon...
Dec  5 06:49:53 np0005546909 systemd[1]: Started libvirt QEMU daemon.
Dec  5 06:49:54 np0005546909 python3.9[153718]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:49:54 np0005546909 systemd[1]: Reloading.
Dec  5 06:49:54 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:49:54 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:49:54 np0005546909 systemd[1]: Starting libvirt secret daemon socket...
Dec  5 06:49:54 np0005546909 systemd[1]: Listening on libvirt secret daemon socket.
Dec  5 06:49:54 np0005546909 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  5 06:49:54 np0005546909 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  5 06:49:54 np0005546909 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  5 06:49:54 np0005546909 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  5 06:49:55 np0005546909 systemd[1]: Starting libvirt secret daemon...
Dec  5 06:49:55 np0005546909 systemd[1]: Started libvirt secret daemon.
Dec  5 06:49:55 np0005546909 python3.9[153930]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:56 np0005546909 python3.9[154082]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:49:57 np0005546909 python3.9[154234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:49:58 np0005546909 python3.9[154357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935396.9463062-1120-92344443357151/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:59 np0005546909 python3.9[154509]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:49:59 np0005546909 python3.9[154661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:00 np0005546909 python3.9[154739]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:01 np0005546909 python3.9[154891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:01 np0005546909 python3.9[154969]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zuvd0a23 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:02 np0005546909 python3.9[155121]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:02 np0005546909 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  5 06:50:02 np0005546909 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  5 06:50:02 np0005546909 python3.9[155199]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.992 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:50:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.993 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:50:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:50:02.993 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:50:03 np0005546909 python3.9[155351]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:04 np0005546909 podman[155476]: 2025-12-05 11:50:04.393872397 +0000 UTC m=+0.059061578 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 06:50:04 np0005546909 python3[155520]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 06:50:05 np0005546909 python3.9[155672]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:05 np0005546909 python3.9[155750]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:06 np0005546909 python3.9[155902]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:07 np0005546909 python3.9[155980]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:07 np0005546909 python3.9[156132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:08 np0005546909 python3.9[156210]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:09 np0005546909 python3.9[156362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:09 np0005546909 python3.9[156440]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:10 np0005546909 python3.9[156592]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:11 np0005546909 python3.9[156717]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935409.9812627-1245-171749698121015/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:11 np0005546909 podman[156841]: 2025-12-05 11:50:11.801459497 +0000 UTC m=+0.115112908 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 06:50:11 np0005546909 python3.9[156886]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:12 np0005546909 python3.9[157047]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:13 np0005546909 python3.9[157202]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:14 np0005546909 python3.9[157354]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:15 np0005546909 python3.9[157507]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:50:16 np0005546909 python3.9[157661]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:17 np0005546909 python3.9[157816]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:18 np0005546909 python3.9[157968]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:18 np0005546909 python3.9[158091]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935417.4288177-1317-230268456682157/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:19 np0005546909 python3.9[158243]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:20 np0005546909 python3.9[158366]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935418.8997276-1332-2597566670873/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:20 np0005546909 python3.9[158518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:21 np0005546909 python3.9[158641]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935420.2872055-1347-82813606919032/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:22 np0005546909 python3.9[158793]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:50:22 np0005546909 systemd[1]: Reloading.
Dec  5 06:50:22 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:50:22 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:50:22 np0005546909 systemd[1]: Reached target edpm_libvirt.target.
Dec  5 06:50:23 np0005546909 python3.9[158983]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  5 06:50:23 np0005546909 systemd[1]: Reloading.
Dec  5 06:50:23 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:50:23 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:50:24 np0005546909 systemd[1]: Reloading.
Dec  5 06:50:24 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:50:24 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:50:24 np0005546909 systemd[1]: session-22.scope: Deactivated successfully.
Dec  5 06:50:24 np0005546909 systemd[1]: session-22.scope: Consumed 3min 31.057s CPU time.
Dec  5 06:50:24 np0005546909 systemd-logind[792]: Session 22 logged out. Waiting for processes to exit.
Dec  5 06:50:24 np0005546909 systemd-logind[792]: Removed session 22.
Dec  5 06:50:30 np0005546909 systemd-logind[792]: New session 23 of user zuul.
Dec  5 06:50:30 np0005546909 systemd[1]: Started Session 23 of User zuul.
Dec  5 06:50:31 np0005546909 python3.9[159234]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:50:33 np0005546909 python3.9[159388]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:50:33 np0005546909 network[159405]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:50:33 np0005546909 network[159406]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:50:33 np0005546909 network[159407]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:50:34 np0005546909 podman[159420]: 2025-12-05 11:50:34.554647339 +0000 UTC m=+0.076887720 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  5 06:50:38 np0005546909 python3.9[159697]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  5 06:50:39 np0005546909 python3.9[159781]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:50:42 np0005546909 podman[159783]: 2025-12-05 11:50:42.303816617 +0000 UTC m=+0.142825970 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 06:50:45 np0005546909 python3.9[159960]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:50:46 np0005546909 python3.9[160112]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:47 np0005546909 python3.9[160265]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:50:47 np0005546909 python3.9[160417]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:50:49 np0005546909 python3.9[160570]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:50:49 np0005546909 python3.9[160693]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935447.8522537-95-8448953494723/.source.iscsi _original_basename=.8bkcp8sd follow=False checksum=a829c6ed530b00b3536c4c41b581253018e4d1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:50 np0005546909 python3.9[160845]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:51 np0005546909 python3.9[160997]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:50:51 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:50:51 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:50:53 np0005546909 python3.9[161150]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:50:53 np0005546909 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  5 06:50:54 np0005546909 python3.9[161306]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:50:54 np0005546909 systemd[1]: Reloading.
Dec  5 06:50:54 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:50:54 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:50:54 np0005546909 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  5 06:50:54 np0005546909 systemd[1]: Starting Open-iSCSI...
Dec  5 06:50:54 np0005546909 kernel: Loading iSCSI transport class v2.0-870.
Dec  5 06:50:54 np0005546909 systemd[1]: Started Open-iSCSI.
Dec  5 06:50:54 np0005546909 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  5 06:50:54 np0005546909 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  5 06:50:55 np0005546909 python3.9[161507]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:50:55 np0005546909 network[161524]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:50:55 np0005546909 network[161525]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:50:55 np0005546909 network[161526]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:51:00 np0005546909 python3.9[161797]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 06:51:01 np0005546909 python3.9[161949]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  5 06:51:02 np0005546909 python3.9[162105]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:02 np0005546909 python3.9[162228]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935461.801045-172-148488548558277/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.994 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:51:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:51:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:51:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:51:03 np0005546909 python3.9[162380]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:04 np0005546909 podman[162504]: 2025-12-05 11:51:04.965881486 +0000 UTC m=+0.071293502 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:51:05 np0005546909 python3.9[162551]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:51:05 np0005546909 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  5 06:51:05 np0005546909 systemd[1]: Stopped Load Kernel Modules.
Dec  5 06:51:05 np0005546909 systemd[1]: Stopping Load Kernel Modules...
Dec  5 06:51:05 np0005546909 systemd[1]: Starting Load Kernel Modules...
Dec  5 06:51:05 np0005546909 systemd[1]: Finished Load Kernel Modules.
Dec  5 06:51:06 np0005546909 python3.9[162708]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:06 np0005546909 python3.9[162860]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:07 np0005546909 python3.9[163012]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:08 np0005546909 python3.9[163164]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:08 np0005546909 python3.9[163287]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935467.7157578-230-117262560434775/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:09 np0005546909 python3.9[163439]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:51:10 np0005546909 python3.9[163592]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:11 np0005546909 python3.9[163744]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:11 np0005546909 python3.9[163896]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:12 np0005546909 podman[164020]: 2025-12-05 11:51:12.585751182 +0000 UTC m=+0.107043850 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  5 06:51:12 np0005546909 python3.9[164068]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:13 np0005546909 python3.9[164226]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:13 np0005546909 python3.9[164378]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:14 np0005546909 python3.9[164530]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:15 np0005546909 python3.9[164682]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:16 np0005546909 python3.9[164838]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:17 np0005546909 python3.9[164990]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:17 np0005546909 python3.9[165142]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:18 np0005546909 python3.9[165220]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:18 np0005546909 python3.9[165372]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:19 np0005546909 python3.9[165450]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:20 np0005546909 python3.9[165602]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:20 np0005546909 python3.9[165754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:21 np0005546909 python3.9[165832]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:21 np0005546909 python3.9[165984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:22 np0005546909 python3.9[166062]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:23 np0005546909 python3.9[166214]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:51:23 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:23 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:23 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:24 np0005546909 python3.9[166404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:24 np0005546909 python3.9[166482]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:25 np0005546909 python3.9[166634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:25 np0005546909 python3.9[166712]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:26 np0005546909 python3.9[166864]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:51:26 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:26 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:26 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:26 np0005546909 systemd[1]: Starting Create netns directory...
Dec  5 06:51:26 np0005546909 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  5 06:51:26 np0005546909 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  5 06:51:26 np0005546909 systemd[1]: Finished Create netns directory.
Dec  5 06:51:27 np0005546909 python3.9[167057]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:28 np0005546909 python3.9[167209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:29 np0005546909 python3.9[167332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935487.9978871-437-7665812514565/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:29 np0005546909 python3.9[167484]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:51:30 np0005546909 python3.9[167636]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:31 np0005546909 python3.9[167759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935490.22195-462-76224646519653/.source.json _original_basename=.2bjuzt9u follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:32 np0005546909 python3.9[167911]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:34 np0005546909 python3.9[168338]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  5 06:51:35 np0005546909 podman[168462]: 2025-12-05 11:51:35.244671223 +0000 UTC m=+0.103098662 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 06:51:35 np0005546909 python3.9[168505]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:51:36 np0005546909 python3.9[168660]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  5 06:51:37 np0005546909 python3[168838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:51:38 np0005546909 podman[168874]: 2025-12-05 11:51:38.015420378 +0000 UTC m=+0.020683975 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 06:51:38 np0005546909 podman[168874]: 2025-12-05 11:51:38.112205533 +0000 UTC m=+0.117469150 container create 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 06:51:38 np0005546909 python3[168838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  5 06:51:38 np0005546909 python3.9[169064]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:39 np0005546909 python3.9[169218]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:40 np0005546909 python3.9[169294]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:41 np0005546909 python3.9[169445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935500.363893-550-81140421315706/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:41 np0005546909 python3.9[169521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:51:41 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:41 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:41 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:42 np0005546909 python3.9[169632]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:51:42 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:42 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:42 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:42 np0005546909 podman[169634]: 2025-12-05 11:51:42.95383642 +0000 UTC m=+0.142466687 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:51:43 np0005546909 systemd[1]: Starting multipathd container...
Dec  5 06:51:43 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:51:43 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 06:51:43 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 06:51:43 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec  5 06:51:43 np0005546909 podman[169698]: 2025-12-05 11:51:43.317072311 +0000 UTC m=+0.146495353 container init 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:51:43 np0005546909 multipathd[169712]: + sudo -E kolla_set_configs
Dec  5 06:51:43 np0005546909 podman[169698]: 2025-12-05 11:51:43.347680706 +0000 UTC m=+0.177103688 container start 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 06:51:43 np0005546909 podman[169698]: multipathd
Dec  5 06:51:43 np0005546909 systemd[1]: Started multipathd container.
Dec  5 06:51:43 np0005546909 multipathd[169712]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:51:43 np0005546909 multipathd[169712]: INFO:__main__:Validating config file
Dec  5 06:51:43 np0005546909 multipathd[169712]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:51:43 np0005546909 multipathd[169712]: INFO:__main__:Writing out command to execute
Dec  5 06:51:43 np0005546909 multipathd[169712]: ++ cat /run_command
Dec  5 06:51:43 np0005546909 multipathd[169712]: + CMD='/usr/sbin/multipathd -d'
Dec  5 06:51:43 np0005546909 multipathd[169712]: + ARGS=
Dec  5 06:51:43 np0005546909 multipathd[169712]: + sudo kolla_copy_cacerts
Dec  5 06:51:43 np0005546909 multipathd[169712]: + [[ ! -n '' ]]
Dec  5 06:51:43 np0005546909 multipathd[169712]: + . kolla_extend_start
Dec  5 06:51:43 np0005546909 multipathd[169712]: Running command: '/usr/sbin/multipathd -d'
Dec  5 06:51:43 np0005546909 multipathd[169712]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  5 06:51:43 np0005546909 multipathd[169712]: + umask 0022
Dec  5 06:51:43 np0005546909 multipathd[169712]: + exec /usr/sbin/multipathd -d
Dec  5 06:51:43 np0005546909 podman[169719]: 2025-12-05 11:51:43.456572189 +0000 UTC m=+0.093840579 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec  5 06:51:43 np0005546909 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:51:43 np0005546909 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.service: Failed with result 'exit-code'.
Dec  5 06:51:43 np0005546909 multipathd[169712]: 2884.091357 | --------start up--------
Dec  5 06:51:43 np0005546909 multipathd[169712]: 2884.091385 | read /etc/multipath.conf
Dec  5 06:51:43 np0005546909 multipathd[169712]: 2884.097980 | path checkers start up
Dec  5 06:51:46 np0005546909 python3.9[169900]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:51:46 np0005546909 python3.9[170054]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:51:47 np0005546909 python3.9[170219]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:51:47 np0005546909 systemd[1]: Stopping multipathd container...
Dec  5 06:51:48 np0005546909 multipathd[169712]: 2888.915758 | exit (signal)
Dec  5 06:51:48 np0005546909 multipathd[169712]: 2888.915829 | --------shut down-------
Dec  5 06:51:48 np0005546909 systemd[1]: libpod-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec  5 06:51:48 np0005546909 podman[170223]: 2025-12-05 11:51:48.32687779 +0000 UTC m=+0.463266053 container died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:51:48 np0005546909 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-5fbcdf1bc53c9d85.timer: Deactivated successfully.
Dec  5 06:51:48 np0005546909 systemd[1]: Stopped /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec  5 06:51:48 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-userdata-shm.mount: Deactivated successfully.
Dec  5 06:51:48 np0005546909 systemd[1]: var-lib-containers-storage-overlay-fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405-merged.mount: Deactivated successfully.
Dec  5 06:51:48 np0005546909 podman[170223]: 2025-12-05 11:51:48.389805004 +0000 UTC m=+0.526193297 container cleanup 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:51:48 np0005546909 podman[170223]: multipathd
Dec  5 06:51:48 np0005546909 podman[170254]: multipathd
Dec  5 06:51:48 np0005546909 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  5 06:51:48 np0005546909 systemd[1]: Stopped multipathd container.
Dec  5 06:51:48 np0005546909 systemd[1]: Starting multipathd container...
Dec  5 06:51:48 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:51:48 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 06:51:48 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb970a33319bf8af9cd2c940319545f1eca747c8eaa6651d1fb78f4034e99405/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 06:51:48 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.
Dec  5 06:51:48 np0005546909 podman[170267]: 2025-12-05 11:51:48.627645318 +0000 UTC m=+0.143277518 container init 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 06:51:48 np0005546909 multipathd[170283]: + sudo -E kolla_set_configs
Dec  5 06:51:48 np0005546909 podman[170267]: 2025-12-05 11:51:48.67259516 +0000 UTC m=+0.188227310 container start 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 06:51:48 np0005546909 podman[170267]: multipathd
Dec  5 06:51:48 np0005546909 systemd[1]: Started multipathd container.
Dec  5 06:51:48 np0005546909 multipathd[170283]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:51:48 np0005546909 multipathd[170283]: INFO:__main__:Validating config file
Dec  5 06:51:48 np0005546909 multipathd[170283]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:51:48 np0005546909 multipathd[170283]: INFO:__main__:Writing out command to execute
Dec  5 06:51:48 np0005546909 multipathd[170283]: ++ cat /run_command
Dec  5 06:51:48 np0005546909 multipathd[170283]: + CMD='/usr/sbin/multipathd -d'
Dec  5 06:51:48 np0005546909 multipathd[170283]: + ARGS=
Dec  5 06:51:48 np0005546909 multipathd[170283]: + sudo kolla_copy_cacerts
Dec  5 06:51:48 np0005546909 podman[170290]: 2025-12-05 11:51:48.779553163 +0000 UTC m=+0.085442468 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 06:51:48 np0005546909 multipathd[170283]: + [[ ! -n '' ]]
Dec  5 06:51:48 np0005546909 multipathd[170283]: + . kolla_extend_start
Dec  5 06:51:48 np0005546909 multipathd[170283]: Running command: '/usr/sbin/multipathd -d'
Dec  5 06:51:48 np0005546909 multipathd[170283]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  5 06:51:48 np0005546909 multipathd[170283]: + umask 0022
Dec  5 06:51:48 np0005546909 multipathd[170283]: + exec /usr/sbin/multipathd -d
Dec  5 06:51:48 np0005546909 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-3b598ae60d2c9b07.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:51:48 np0005546909 systemd[1]: 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb-3b598ae60d2c9b07.service: Failed with result 'exit-code'.
Dec  5 06:51:48 np0005546909 multipathd[170283]: 2889.422783 | --------start up--------
Dec  5 06:51:48 np0005546909 multipathd[170283]: 2889.422800 | read /etc/multipath.conf
Dec  5 06:51:48 np0005546909 multipathd[170283]: 2889.429575 | path checkers start up
Dec  5 06:51:49 np0005546909 python3.9[170471]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:50 np0005546909 python3.9[170624]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  5 06:51:51 np0005546909 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  5 06:51:51 np0005546909 python3.9[170776]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  5 06:51:51 np0005546909 kernel: Key type psk registered
Dec  5 06:51:52 np0005546909 python3.9[170941]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:51:52 np0005546909 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  5 06:51:52 np0005546909 python3.9[171064]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935511.5218735-630-94802856244009/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:53 np0005546909 python3.9[171217]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:51:53 np0005546909 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  5 06:51:54 np0005546909 python3.9[171370]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:51:54 np0005546909 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  5 06:51:54 np0005546909 systemd[1]: Stopped Load Kernel Modules.
Dec  5 06:51:54 np0005546909 systemd[1]: Stopping Load Kernel Modules...
Dec  5 06:51:54 np0005546909 systemd[1]: Starting Load Kernel Modules...
Dec  5 06:51:54 np0005546909 systemd[1]: Finished Load Kernel Modules.
Dec  5 06:51:54 np0005546909 python3.9[171526]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  5 06:51:55 np0005546909 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  5 06:51:57 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:58 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:58 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:58 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:58 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:58 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:58 np0005546909 systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  5 06:51:58 np0005546909 systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  5 06:51:59 np0005546909 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  5 06:51:59 np0005546909 systemd[1]: Starting man-db-cache-update.service...
Dec  5 06:51:59 np0005546909 systemd[1]: Reloading.
Dec  5 06:51:59 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:51:59 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:51:59 np0005546909 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  5 06:52:01 np0005546909 python3.9[172944]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:52:01 np0005546909 systemd[1]: Stopping Open-iSCSI...
Dec  5 06:52:01 np0005546909 iscsid[161346]: iscsid shutting down.
Dec  5 06:52:01 np0005546909 systemd[1]: iscsid.service: Deactivated successfully.
Dec  5 06:52:01 np0005546909 systemd[1]: Stopped Open-iSCSI.
Dec  5 06:52:01 np0005546909 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  5 06:52:01 np0005546909 systemd[1]: Starting Open-iSCSI...
Dec  5 06:52:01 np0005546909 systemd[1]: Started Open-iSCSI.
Dec  5 06:52:01 np0005546909 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  5 06:52:01 np0005546909 systemd[1]: Finished man-db-cache-update.service.
Dec  5 06:52:01 np0005546909 systemd[1]: man-db-cache-update.service: Consumed 1.778s CPU time.
Dec  5 06:52:01 np0005546909 systemd[1]: run-rd075b8d777104360a5fd69052d17ebe9.service: Deactivated successfully.
Dec  5 06:52:02 np0005546909 python3.9[173134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:52:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.996 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:52:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.997 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:52:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:52:02.998 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:52:03 np0005546909 python3.9[173290]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:04 np0005546909 python3.9[173442]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:52:04 np0005546909 systemd[1]: Reloading.
Dec  5 06:52:04 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:52:04 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:52:05 np0005546909 python3.9[173629]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:52:05 np0005546909 network[173646]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:52:05 np0005546909 network[173647]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:52:05 np0005546909 network[173648]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:52:05 np0005546909 podman[173653]: 2025-12-05 11:52:05.712415288 +0000 UTC m=+0.086091535 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  5 06:52:10 np0005546909 python3.9[173939]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:11 np0005546909 python3.9[174092]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:12 np0005546909 python3.9[174245]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:13 np0005546909 python3.9[174398]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:13 np0005546909 podman[174399]: 2025-12-05 11:52:13.250723002 +0000 UTC m=+0.100372280 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 06:52:13 np0005546909 python3.9[174577]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:14 np0005546909 python3.9[174730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:15 np0005546909 python3.9[174883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:16 np0005546909 python3.9[175036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:52:17 np0005546909 python3.9[175189]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:17 np0005546909 python3.9[175341]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:18 np0005546909 python3.9[175493]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:18 np0005546909 podman[175617]: 2025-12-05 11:52:18.895822725 +0000 UTC m=+0.054313869 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  5 06:52:19 np0005546909 python3.9[175665]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:19 np0005546909 python3.9[175817]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:20 np0005546909 python3.9[175969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:21 np0005546909 python3.9[176121]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:21 np0005546909 python3.9[176273]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:22 np0005546909 python3.9[176425]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:23 np0005546909 python3.9[176577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:23 np0005546909 python3.9[176729]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:24 np0005546909 python3.9[176881]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:25 np0005546909 python3.9[177033]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:25 np0005546909 python3.9[177185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:26 np0005546909 python3.9[177337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:27 np0005546909 python3.9[177489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:52:27 np0005546909 python3.9[177641]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:28 np0005546909 python3.9[177793]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:52:29 np0005546909 python3.9[177945]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:52:29 np0005546909 systemd[1]: Reloading.
Dec  5 06:52:29 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:52:29 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:52:30 np0005546909 python3.9[178132]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:31 np0005546909 python3.9[178285]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:32 np0005546909 python3.9[178438]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:32 np0005546909 python3.9[178591]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:33 np0005546909 python3.9[178744]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:34 np0005546909 python3.9[178897]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:34 np0005546909 python3.9[179050]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:35 np0005546909 python3.9[179203]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:52:36 np0005546909 podman[179328]: 2025-12-05 11:52:36.720309051 +0000 UTC m=+0.063215673 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec  5 06:52:36 np0005546909 python3.9[179374]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:37 np0005546909 python3.9[179526]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:38 np0005546909 python3.9[179678]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:38 np0005546909 python3.9[179830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:39 np0005546909 python3.9[179982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:40 np0005546909 python3.9[180134]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:40 np0005546909 python3.9[180286]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:41 np0005546909 python3.9[180438]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:42 np0005546909 python3.9[180590]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:42 np0005546909 python3.9[180742]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:44 np0005546909 podman[180767]: 2025-12-05 11:52:44.251048544 +0000 UTC m=+0.098235752 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:52:47 np0005546909 python3.9[180920]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  5 06:52:48 np0005546909 python3.9[181073]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:52:49 np0005546909 podman[181075]: 2025-12-05 11:52:49.250951353 +0000 UTC m=+0.091456017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:52:50 np0005546909 python3.9[181250]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 06:52:52 np0005546909 systemd-logind[792]: New session 24 of user zuul.
Dec  5 06:52:52 np0005546909 systemd[1]: Started Session 24 of User zuul.
Dec  5 06:52:52 np0005546909 systemd[1]: session-24.scope: Deactivated successfully.
Dec  5 06:52:52 np0005546909 systemd-logind[792]: Session 24 logged out. Waiting for processes to exit.
Dec  5 06:52:52 np0005546909 systemd-logind[792]: Removed session 24.
Dec  5 06:52:53 np0005546909 python3.9[181436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:54 np0005546909 python3.9[181557]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935572.9950733-1229-7226660787892/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:54 np0005546909 python3.9[181707]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:55 np0005546909 python3.9[181783]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:55 np0005546909 python3.9[181933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:56 np0005546909 python3.9[182054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935575.1992695-1229-53174794787259/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:56 np0005546909 python3.9[182204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:57 np0005546909 python3.9[182325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935576.3250377-1229-103211430038952/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:58 np0005546909 python3.9[182475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:58 np0005546909 python3.9[182596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935577.4364803-1229-122980577008084/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:52:59 np0005546909 python3.9[182746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:52:59 np0005546909 python3.9[182867]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935578.7208652-1229-260771065739552/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:53:00 np0005546909 python3.9[183019]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:00 np0005546909 python3.9[183171]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:01 np0005546909 python3.9[183323]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:02 np0005546909 python3.9[183475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:53:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.998 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:53:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:53:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:53:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:53:03 np0005546909 python3.9[183598]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764935581.804586-1336-249519458313554/.source _original_basename=.layqsaty follow=False checksum=f14ce4c1d82487c4ae1e4905f59d714f2109e75f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  5 06:53:03 np0005546909 python3.9[183750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:04 np0005546909 python3.9[183902]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:53:05 np0005546909 python3.9[184023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935584.0078566-1362-184769739173268/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:53:05 np0005546909 python3.9[184173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:53:06 np0005546909 python3.9[184294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935585.2016935-1377-32762689516596/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:53:06 np0005546909 podman[184418]: 2025-12-05 11:53:06.943877587 +0000 UTC m=+0.068314259 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  5 06:53:07 np0005546909 python3.9[184459]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  5 06:53:07 np0005546909 python3.9[184617]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:53:08 np0005546909 python3[184769]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:53:09 np0005546909 podman[184803]: 2025-12-05 11:53:08.988260274 +0000 UTC m=+0.019196615 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 06:53:09 np0005546909 podman[184803]: 2025-12-05 11:53:09.790222131 +0000 UTC m=+0.821158442 container create 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 06:53:09 np0005546909 python3[184769]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  5 06:53:10 np0005546909 python3.9[184994]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:11 np0005546909 python3.9[185148]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  5 06:53:12 np0005546909 python3.9[185300]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:53:13 np0005546909 python3[185452]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:53:13 np0005546909 podman[185492]: 2025-12-05 11:53:13.759116852 +0000 UTC m=+0.029290095 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  5 06:53:13 np0005546909 podman[185492]: 2025-12-05 11:53:13.954689067 +0000 UTC m=+0.224862290 container create 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 06:53:13 np0005546909 python3[185452]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  5 06:53:14 np0005546909 podman[185650]: 2025-12-05 11:53:14.635909046 +0000 UTC m=+0.123368315 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  5 06:53:14 np0005546909 python3.9[185698]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:15 np0005546909 python3.9[185858]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:16 np0005546909 python3.9[186009]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935595.6853445-1469-89166165564689/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:16 np0005546909 python3.9[186085]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:53:16 np0005546909 systemd[1]: Reloading.
Dec  5 06:53:17 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:53:17 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:53:17 np0005546909 python3.9[186195]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:53:17 np0005546909 systemd[1]: Reloading.
Dec  5 06:53:17 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:53:17 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:53:18 np0005546909 systemd[1]: Starting nova_compute container...
Dec  5 06:53:18 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:53:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:18 np0005546909 podman[186234]: 2025-12-05 11:53:18.668720276 +0000 UTC m=+0.505320859 container init 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 06:53:18 np0005546909 podman[186234]: 2025-12-05 11:53:18.675285275 +0000 UTC m=+0.511885838 container start 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + sudo -E kolla_set_configs
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Validating config file
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying service configuration files
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Deleting /etc/ceph
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Creating directory /etc/ceph
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /etc/ceph
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Writing out command to execute
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:18 np0005546909 nova_compute[186250]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 06:53:18 np0005546909 nova_compute[186250]: ++ cat /run_command
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + CMD=nova-compute
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + ARGS=
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + sudo kolla_copy_cacerts
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + [[ ! -n '' ]]
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + . kolla_extend_start
Dec  5 06:53:18 np0005546909 nova_compute[186250]: Running command: 'nova-compute'
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + echo 'Running command: '\''nova-compute'\'''
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + umask 0022
Dec  5 06:53:18 np0005546909 nova_compute[186250]: + exec nova-compute
Dec  5 06:53:18 np0005546909 podman[186234]: nova_compute
Dec  5 06:53:18 np0005546909 systemd[1]: Started nova_compute container.
Dec  5 06:53:19 np0005546909 podman[186386]: 2025-12-05 11:53:19.652202366 +0000 UTC m=+0.055274713 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3)
Dec  5 06:53:19 np0005546909 python3.9[186429]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:20 np0005546909 python3.9[186582]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.809 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.809 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.810 186254 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.810 186254 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.950 186254 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.978 186254 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:53:20 np0005546909 nova_compute[186250]: 2025-12-05 11:53:20.979 186254 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  5 06:53:21 np0005546909 python3.9[186736]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.548 186254 INFO nova.virt.driver [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.662 186254 INFO nova.compute.provider_config [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.676 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.677 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.678 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.679 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.680 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.681 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.682 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.683 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.684 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.685 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.686 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.687 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.688 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.689 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.690 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.691 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.692 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.693 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.694 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.695 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.696 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.697 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.698 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.699 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.700 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.701 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.702 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.703 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.704 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.705 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.706 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.707 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.708 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.709 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.710 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.711 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.712 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.713 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.714 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.715 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.716 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.717 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.718 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.719 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.720 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.721 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.722 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.723 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.724 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.725 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.726 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.727 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.728 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.729 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.730 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.731 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.732 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.733 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.734 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.735 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.736 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.737 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.738 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.739 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.740 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.741 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.742 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.743 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.744 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.745 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.746 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.747 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.748 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 WARNING oslo_config.cfg [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  5 06:53:21 np0005546909 nova_compute[186250]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  5 06:53:21 np0005546909 nova_compute[186250]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  5 06:53:21 np0005546909 nova_compute[186250]: and ``live_migration_inbound_addr`` respectively.
Dec  5 06:53:21 np0005546909 nova_compute[186250]: ).  Its value may be silently ignored in the future.#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.749 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.750 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.751 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.752 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.753 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.754 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.755 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.756 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.757 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.758 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.759 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.760 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.761 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.762 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.763 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.764 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.765 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.766 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.767 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.768 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.769 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.770 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.771 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.772 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.773 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.774 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.775 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.776 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.777 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.778 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.779 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.780 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.781 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.782 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.783 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.784 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.785 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.786 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.787 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.788 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.789 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.790 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.791 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.792 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.793 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.794 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.795 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.796 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.797 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.798 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.799 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.800 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.801 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.802 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.803 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.804 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.805 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.806 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.807 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.808 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.809 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.810 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.811 186254 DEBUG oslo_service.service [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.812 186254 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.824 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.825 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  5 06:53:21 np0005546909 systemd[1]: Starting libvirt QEMU daemon...
Dec  5 06:53:21 np0005546909 systemd[1]: Started libvirt QEMU daemon.
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.911 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f2035121cd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.915 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f2035121cd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.916 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.931 186254 WARNING nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec  5 06:53:21 np0005546909 nova_compute[186250]: 2025-12-05 11:53:21.932 186254 DEBUG nova.virt.libvirt.volume.mount [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  5 06:53:22 np0005546909 python3.9[186940]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  5 06:53:22 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:53:22 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.792 186254 INFO nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host capabilities <capabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <host>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <uuid>60bd4df1-481e-4d23-9585-8528ade5c2b1</uuid>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <arch>x86_64</arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model>EPYC-Rome-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <vendor>AMD</vendor>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <microcode version='16777317'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <signature family='23' model='49' stepping='0'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='x2apic'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='tsc-deadline'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='osxsave'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='hypervisor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='tsc_adjust'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='spec-ctrl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='stibp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='arch-capabilities'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='cmp_legacy'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='topoext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='virt-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='lbrv'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='tsc-scale'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='vmcb-clean'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='pause-filter'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='pfthreshold'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='svme-addr-chk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='rdctl-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='skip-l1dfl-vmentry'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='mds-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature name='pschange-mc-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <pages unit='KiB' size='4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <pages unit='KiB' size='2048'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <pages unit='KiB' size='1048576'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <power_management>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <suspend_mem/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <suspend_disk/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <suspend_hybrid/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </power_management>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <iommu support='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <migration_features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <live/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <uri_transports>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <uri_transport>tcp</uri_transport>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <uri_transport>rdma</uri_transport>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </uri_transports>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </migration_features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <topology>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <cells num='1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <cell id='0'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <memory unit='KiB'>7864316</memory>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <pages unit='KiB' size='2048'>0</pages>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <distances>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <sibling id='0' value='10'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          </distances>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          <cpus num='8'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:          </cpus>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        </cell>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </cells>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </topology>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <cache>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </cache>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <secmodel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model>selinux</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <doi>0</doi>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </secmodel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <secmodel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model>dac</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <doi>0</doi>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </secmodel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </host>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <guest>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <os_type>hvm</os_type>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <arch name='i686'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <wordsize>32</wordsize>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <domain type='qemu'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <domain type='kvm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <pae/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <nonpae/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <acpi default='on' toggle='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <apic default='on' toggle='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <cpuselection/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <deviceboot/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <disksnapshot default='on' toggle='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <externalSnapshot/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </guest>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <guest>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <os_type>hvm</os_type>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <arch name='x86_64'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <wordsize>64</wordsize>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <domain type='qemu'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <domain type='kvm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <acpi default='on' toggle='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <apic default='on' toggle='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <cpuselection/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <deviceboot/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <disksnapshot default='on' toggle='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <externalSnapshot/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </guest>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 
Dec  5 06:53:22 np0005546909 nova_compute[186250]: </capabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: #033[00m
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.801 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.823 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  5 06:53:22 np0005546909 nova_compute[186250]: <domainCapabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <domain>kvm</domain>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <arch>i686</arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <vcpu max='4096'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <iothreads supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <os supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <enum name='firmware'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <loader supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>rom</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pflash</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='readonly'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>yes</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='secure'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </loader>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </os>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='maximumMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <vendor>AMD</vendor>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='succor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='custom' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-128'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-256'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-512'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='athlon'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='athlon-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='core2duo'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='core2duo-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='coreduo'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='coreduo-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='n270'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='n270-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='phenom'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='phenom-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <memoryBacking supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <enum name='sourceType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>file</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>anonymous</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>memfd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </memoryBacking>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <devices>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <disk supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='diskDevice'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>disk</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>cdrom</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>floppy</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>lun</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>fdc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>sata</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </disk>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <graphics supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vnc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>egl-headless</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </graphics>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <video supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='modelType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vga</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>cirrus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>none</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>bochs</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>ramfb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </video>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <hostdev supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='mode'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>subsystem</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='startupPolicy'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>mandatory</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>requisite</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>optional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='subsysType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pci</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='capsType'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='pciBackend'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </hostdev>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <rng supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>random</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>egd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </rng>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <filesystem supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='driverType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>path</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>handle</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtiofs</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </filesystem>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <tpm supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tpm-tis</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tpm-crb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>emulator</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>external</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendVersion'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>2.0</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </tpm>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <redirdev supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </redirdev>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <channel supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </channel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <crypto supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>qemu</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </crypto>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <interface supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>passt</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </interface>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <panic supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>isa</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>hyperv</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </panic>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <console supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>null</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dev</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>file</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pipe</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>stdio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>udp</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tcp</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>qemu-vdagent</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </console>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </devices>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <gic supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <genid supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <backup supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <async-teardown supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <ps2 supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <sev supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <sgx supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <hyperv supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='features'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>relaxed</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vapic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>spinlocks</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vpindex</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>runtime</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>synic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>stimer</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>reset</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vendor_id</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>frequencies</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>reenlightenment</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tlbflush</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>ipi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>avic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>emsr_bitmap</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>xmm_input</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <defaults>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </defaults>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </hyperv>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <launchSecurity supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='sectype'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tdx</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </launchSecurity>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: </domainCapabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.831 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  5 06:53:22 np0005546909 nova_compute[186250]: <domainCapabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <domain>kvm</domain>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <arch>i686</arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <vcpu max='240'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <iothreads supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <os supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <enum name='firmware'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <loader supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>rom</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pflash</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='readonly'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>yes</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='secure'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </loader>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </os>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='maximumMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <vendor>AMD</vendor>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='succor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='custom' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-128'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-256'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-512'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='athlon'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='athlon-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='core2duo'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='core2duo-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='coreduo'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='coreduo-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='n270'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='n270-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='phenom'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='phenom-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <memoryBacking supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <enum name='sourceType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>file</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>anonymous</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>memfd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </memoryBacking>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <devices>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <disk supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='diskDevice'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>disk</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>cdrom</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>floppy</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>lun</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>ide</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>fdc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>sata</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </disk>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <graphics supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vnc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>egl-headless</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </graphics>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <video supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='modelType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vga</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>cirrus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>none</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>bochs</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>ramfb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </video>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <hostdev supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='mode'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>subsystem</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='startupPolicy'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>mandatory</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>requisite</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>optional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='subsysType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pci</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='capsType'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='pciBackend'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </hostdev>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <rng supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>random</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>egd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </rng>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <filesystem supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='driverType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>path</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>handle</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>virtiofs</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </filesystem>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <tpm supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tpm-tis</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tpm-crb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>emulator</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>external</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendVersion'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>2.0</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </tpm>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <redirdev supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </redirdev>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <channel supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </channel>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <crypto supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>qemu</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </crypto>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <interface supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='backendType'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>passt</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </interface>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <panic supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>isa</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>hyperv</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </panic>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <console supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>null</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vc</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dev</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>file</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pipe</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>stdio</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>udp</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tcp</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>qemu-vdagent</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </console>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </devices>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <gic supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <genid supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <backup supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <async-teardown supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <ps2 supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <sev supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <sgx supported='no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <hyperv supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='features'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>relaxed</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vapic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>spinlocks</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vpindex</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>runtime</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>synic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>stimer</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>reset</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>vendor_id</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>frequencies</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>reenlightenment</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tlbflush</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>ipi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>avic</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>emsr_bitmap</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>xmm_input</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <defaults>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </defaults>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </hyperv>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <launchSecurity supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='sectype'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>tdx</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </launchSecurity>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </features>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: </domainCapabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.856 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  5 06:53:22 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.860 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  5 06:53:22 np0005546909 nova_compute[186250]: <domainCapabilities>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <domain>kvm</domain>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <arch>x86_64</arch>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <vcpu max='4096'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <iothreads supported='yes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <os supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <enum name='firmware'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>efi</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <loader supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>rom</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>pflash</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='readonly'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>yes</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='secure'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>yes</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </loader>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  </os>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:  <cpu>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <enum name='maximumMigratable'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <vendor>AMD</vendor>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='succor'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:    <mode name='custom' supported='yes'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-128'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-256'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx10-512'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:22 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='athlon'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='athlon-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='core2duo'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='core2duo-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='coreduo'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='coreduo-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='n270'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='n270-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='phenom'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='phenom-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </cpu>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <memoryBacking supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <enum name='sourceType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>file</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>anonymous</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>memfd</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </memoryBacking>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <devices>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <disk supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='diskDevice'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>disk</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>cdrom</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>floppy</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>lun</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>fdc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>sata</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </disk>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <graphics supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vnc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>egl-headless</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </graphics>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <video supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='modelType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vga</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>cirrus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>none</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>bochs</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>ramfb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </video>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <hostdev supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='mode'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>subsystem</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='startupPolicy'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>mandatory</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>requisite</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>optional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='subsysType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pci</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='capsType'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='pciBackend'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </hostdev>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <rng supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>random</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>egd</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </rng>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <filesystem supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='driverType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>path</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>handle</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtiofs</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </filesystem>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <tpm supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tpm-tis</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tpm-crb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>emulator</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>external</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendVersion'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>2.0</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </tpm>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <redirdev supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </redirdev>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <channel supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </channel>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <crypto supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>qemu</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </crypto>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <interface supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>passt</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </interface>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <panic supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>isa</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>hyperv</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </panic>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <console supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>null</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dev</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>file</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pipe</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>stdio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>udp</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tcp</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>qemu-vdagent</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </console>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </devices>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <features>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <gic supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <genid supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <backup supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <async-teardown supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <ps2 supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <sev supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <sgx supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <hyperv supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='features'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>relaxed</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vapic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>spinlocks</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vpindex</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>runtime</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>synic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>stimer</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>reset</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vendor_id</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>frequencies</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>reenlightenment</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tlbflush</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>ipi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>avic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>emsr_bitmap</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>xmm_input</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <defaults>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </defaults>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </hyperv>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <launchSecurity supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='sectype'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tdx</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </launchSecurity>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </features>
Dec  5 06:53:23 np0005546909 nova_compute[186250]: </domainCapabilities>
Dec  5 06:53:23 np0005546909 nova_compute[186250]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.920 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  5 06:53:23 np0005546909 nova_compute[186250]: <domainCapabilities>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <domain>kvm</domain>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <arch>x86_64</arch>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <vcpu max='240'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <iothreads supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <os supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <enum name='firmware'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <loader supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>rom</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pflash</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='readonly'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>yes</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='secure'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>no</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </loader>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </os>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <cpu>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='maximumMigratable'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>on</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>off</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <vendor>AMD</vendor>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='succor'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <mode name='custom' supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Denverton'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Denverton-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='auto-ibrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amd-psfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='stibp-always-on'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='EPYC-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx10'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx10-128'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx10-256'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx10-512'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='prefetchiti'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Haswell-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512er'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512pf'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fma4'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tbm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xop'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='amx-tile'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-bf16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-fp16'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bitalg'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrc'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fzrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='la57'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='taa-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xfd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SierraForest'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-ifma'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cmpccxadd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fbsdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='fsrs'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ibrs-all'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mcdt-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pbrsb-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='psdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='serialize'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vaes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='hle'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='rtm'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512bw'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512cd'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512dq'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512f'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='avx512vl'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='invpcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pcid'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='pku'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='mpx'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='core-capability'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='split-lock-detect'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='cldemote'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='erms'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='gfni'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdir64b'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='movdiri'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='xsaves'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='athlon'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='athlon-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='core2duo'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='core2duo-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='coreduo'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='coreduo-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='n270'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='n270-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='ss'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='phenom'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <blockers model='phenom-v1'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnow'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <feature name='3dnowext'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </blockers>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </mode>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </cpu>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <memoryBacking supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <enum name='sourceType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>file</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>anonymous</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <value>memfd</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </memoryBacking>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <devices>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <disk supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='diskDevice'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>disk</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>cdrom</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>floppy</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>lun</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>ide</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>fdc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>sata</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </disk>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <graphics supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vnc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>egl-headless</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </graphics>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <video supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='modelType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vga</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>cirrus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>none</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>bochs</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>ramfb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </video>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <hostdev supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='mode'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>subsystem</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='startupPolicy'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>mandatory</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>requisite</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>optional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='subsysType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pci</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>scsi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='capsType'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='pciBackend'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </hostdev>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <rng supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtio-non-transitional</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>random</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>egd</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </rng>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <filesystem supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='driverType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>path</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>handle</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>virtiofs</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </filesystem>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <tpm supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tpm-tis</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tpm-crb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>emulator</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>external</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendVersion'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>2.0</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </tpm>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <redirdev supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='bus'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>usb</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </redirdev>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <channel supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </channel>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <crypto supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>qemu</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendModel'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>builtin</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </crypto>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <interface supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='backendType'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>default</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>passt</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </interface>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <panic supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='model'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>isa</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>hyperv</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </panic>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <console supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='type'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>null</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vc</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pty</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dev</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>file</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>pipe</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>stdio</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>udp</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tcp</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>unix</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>qemu-vdagent</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>dbus</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </console>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </devices>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  <features>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <gic supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <genid supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <backup supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <async-teardown supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <ps2 supported='yes'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <sev supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <sgx supported='no'/>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <hyperv supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='features'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>relaxed</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vapic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>spinlocks</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vpindex</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>runtime</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>synic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>stimer</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>reset</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>vendor_id</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>frequencies</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>reenlightenment</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tlbflush</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>ipi</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>avic</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>emsr_bitmap</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>xmm_input</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <defaults>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </defaults>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </hyperv>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    <launchSecurity supported='yes'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      <enum name='sectype'>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:        <value>tdx</value>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:      </enum>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:    </launchSecurity>
Dec  5 06:53:23 np0005546909 nova_compute[186250]:  </features>
Dec  5 06:53:23 np0005546909 nova_compute[186250]: </domainCapabilities>
Dec  5 06:53:23 np0005546909 nova_compute[186250]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.982 186254 DEBUG nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.983 186254 INFO nova.virt.libvirt.host [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Secure Boot support detected#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.984 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.984 186254 INFO nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:22.993 186254 DEBUG nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.026 186254 INFO nova.virt.node [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.046 186254 WARNING nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Compute nodes ['5111707b-bdc3-4252-b5b7-b3e96ff05344'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.077 186254 INFO nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.108 186254 WARNING nova.compute.manager [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.108 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.109 186254 DEBUG nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:53:23 np0005546909 systemd[1]: Starting libvirt nodedev daemon...
Dec  5 06:53:23 np0005546909 systemd[1]: Started libvirt nodedev daemon.
Dec  5 06:53:23 np0005546909 python3.9[187123]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:53:23 np0005546909 systemd[1]: Stopping nova_compute container...
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.416 186254 WARNING nova.virt.libvirt.driver [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6187MB free_disk=73.5449333190918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.417 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.431 186254 WARNING nova.compute.resource_tracker [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] No compute node record for compute-0.ctlplane.example.com:5111707b-bdc3-4252-b5b7-b3e96ff05344: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5111707b-bdc3-4252-b5b7-b3e96ff05344 could not be found.#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.444 186254 DEBUG oslo_concurrency.lockutils [None req-6dca1851-6808-4314-bb94-095a41dcc121 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.444 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.445 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:53:23 np0005546909 nova_compute[186250]: 2025-12-05 11:53:23.445 186254 DEBUG oslo_concurrency.lockutils [None req-437aecb3-8834-418e-8d6b-509e0c70f2c3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:53:23 np0005546909 virtqemud[186841]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  5 06:53:23 np0005546909 virtqemud[186841]: hostname: compute-0
Dec  5 06:53:23 np0005546909 virtqemud[186841]: End of file while reading data: Input/output error
Dec  5 06:53:23 np0005546909 systemd[1]: libpod-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9.scope: Deactivated successfully.
Dec  5 06:53:23 np0005546909 systemd[1]: libpod-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9.scope: Consumed 3.229s CPU time.
Dec  5 06:53:23 np0005546909 podman[187150]: 2025-12-05 11:53:23.915648255 +0000 UTC m=+0.562184381 container died 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm)
Dec  5 06:53:24 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9-userdata-shm.mount: Deactivated successfully.
Dec  5 06:53:24 np0005546909 systemd[1]: var-lib-containers-storage-overlay-895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41-merged.mount: Deactivated successfully.
Dec  5 06:53:25 np0005546909 podman[187150]: 2025-12-05 11:53:25.538171747 +0000 UTC m=+2.184707863 container cleanup 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Dec  5 06:53:25 np0005546909 podman[187150]: nova_compute
Dec  5 06:53:25 np0005546909 podman[187180]: nova_compute
Dec  5 06:53:25 np0005546909 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  5 06:53:25 np0005546909 systemd[1]: Stopped nova_compute container.
Dec  5 06:53:25 np0005546909 systemd[1]: Starting nova_compute container...
Dec  5 06:53:27 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:53:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895761a5f94a0efd0d3f654dc53155bcfac1c3ee5a7632b4a76fb5cb10e0de41/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:27 np0005546909 podman[187193]: 2025-12-05 11:53:27.83718309 +0000 UTC m=+2.198793707 container init 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:53:27 np0005546909 podman[187193]: 2025-12-05 11:53:27.84791605 +0000 UTC m=+2.209526607 container start 5abe0874e2e06eb6e27823f81db081feeb7715aedc0d9cb9a18d4e737c4eccc9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + sudo -E kolla_set_configs
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Validating config file
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying service configuration files
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /etc/ceph
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Creating directory /etc/ceph
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /etc/ceph
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Writing out command to execute
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:27 np0005546909 nova_compute[187208]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  5 06:53:27 np0005546909 nova_compute[187208]: ++ cat /run_command
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + CMD=nova-compute
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + ARGS=
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + sudo kolla_copy_cacerts
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + [[ ! -n '' ]]
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + . kolla_extend_start
Dec  5 06:53:27 np0005546909 nova_compute[187208]: Running command: 'nova-compute'
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + echo 'Running command: '\''nova-compute'\'''
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + umask 0022
Dec  5 06:53:27 np0005546909 nova_compute[187208]: + exec nova-compute
Dec  5 06:53:28 np0005546909 podman[187193]: nova_compute
Dec  5 06:53:28 np0005546909 systemd[1]: Started nova_compute container.
Dec  5 06:53:28 np0005546909 python3.9[187371]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  5 06:53:29 np0005546909 systemd[1]: Started libpod-conmon-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope.
Dec  5 06:53:29 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:53:29 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:29 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:29 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  5 06:53:29 np0005546909 podman[187397]: 2025-12-05 11:53:29.610375543 +0000 UTC m=+0.543119359 container init 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:53:29 np0005546909 podman[187397]: 2025-12-05 11:53:29.622654287 +0000 UTC m=+0.555398053 container start 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Applying nova statedir ownership
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  5 06:53:29 np0005546909 nova_compute_init[187418]: INFO:nova_statedir:Nova statedir ownership complete
Dec  5 06:53:29 np0005546909 systemd[1]: libpod-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope: Deactivated successfully.
Dec  5 06:53:29 np0005546909 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:29 np0005546909 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:29 np0005546909 nova_compute[187208]: 2025-12-05 11:53:29.895 187212 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  5 06:53:29 np0005546909 nova_compute[187208]: 2025-12-05 11:53:29.896 187212 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  5 06:53:29 np0005546909 python3.9[187371]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  5 06:53:30 np0005546909 podman[187432]: 2025-12-05 11:53:30.002960015 +0000 UTC m=+0.023893159 container died 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.028 187212 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.050 187212 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.051 187212 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.542 187212 INFO nova.virt.driver [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.665 187212 INFO nova.compute.provider_config [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_concurrency.lockutils [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.683 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.684 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.685 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.686 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.687 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.688 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.689 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.690 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.691 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.692 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.693 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.694 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.695 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.696 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.697 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.698 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.699 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.700 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.701 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.702 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.703 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.704 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.705 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.706 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.707 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.708 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.709 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.710 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.711 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.712 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.713 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.714 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.715 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.716 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.717 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.718 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.719 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.720 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.721 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.722 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.723 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.724 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.725 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.726 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.727 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.728 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.729 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.730 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.731 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.732 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.733 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.734 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.735 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.736 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.737 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.738 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.739 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.740 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.741 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.742 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.743 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.744 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.745 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.746 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.747 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.748 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.749 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.750 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.751 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.752 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 WARNING oslo_config.cfg [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  5 06:53:30 np0005546909 nova_compute[187208]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  5 06:53:30 np0005546909 nova_compute[187208]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  5 06:53:30 np0005546909 nova_compute[187208]: and ``live_migration_inbound_addr`` respectively.
Dec  5 06:53:30 np0005546909 nova_compute[187208]: ).  Its value may be silently ignored in the future.#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.753 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.754 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.755 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.756 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.757 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.758 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.759 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.760 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.761 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.762 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.763 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.764 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.765 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.766 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.767 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.768 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.769 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.770 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.771 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.772 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.773 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.774 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.775 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.776 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.777 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.778 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.779 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.780 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.781 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.782 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.783 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.784 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.785 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.786 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.787 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.788 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.789 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.790 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.791 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.792 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.793 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.794 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.795 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.796 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.797 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.798 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.799 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.800 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.801 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.802 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.803 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.804 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.805 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.806 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.807 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.808 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.809 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.810 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.811 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.812 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.813 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.814 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.815 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.816 187212 DEBUG oslo_service.service [None req-5ff656f4-8fd2-473e-a3da-f1ebf6979db2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.817 187212 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.831 187212 INFO nova.virt.node [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.832 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.833 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.847 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd351eec4f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.850 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd351eec4f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.851 187212 INFO nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.865 187212 DEBUG nova.virt.libvirt.volume.mount [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.865 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host capabilities <capabilities>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <host>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <uuid>60bd4df1-481e-4d23-9585-8528ade5c2b1</uuid>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <cpu>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <arch>x86_64</arch>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model>EPYC-Rome-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <vendor>AMD</vendor>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <microcode version='16777317'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <signature family='23' model='49' stepping='0'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='x2apic'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='tsc-deadline'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='osxsave'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='hypervisor'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='tsc_adjust'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='spec-ctrl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='stibp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='arch-capabilities'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='cmp_legacy'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='topoext'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='virt-ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='lbrv'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='tsc-scale'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='vmcb-clean'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='pause-filter'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='pfthreshold'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='svme-addr-chk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='rdctl-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='skip-l1dfl-vmentry'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='mds-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature name='pschange-mc-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <pages unit='KiB' size='4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <pages unit='KiB' size='2048'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <pages unit='KiB' size='1048576'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </cpu>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <power_management>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <suspend_mem/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <suspend_disk/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <suspend_hybrid/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </power_management>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <iommu support='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <migration_features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <live/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <uri_transports>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <uri_transport>tcp</uri_transport>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <uri_transport>rdma</uri_transport>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </uri_transports>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </migration_features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <topology>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <cells num='1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <cell id='0'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <memory unit='KiB'>7864316</memory>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <pages unit='KiB' size='4'>1966079</pages>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <pages unit='KiB' size='2048'>0</pages>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <distances>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <sibling id='0' value='10'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          </distances>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          <cpus num='8'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:          </cpus>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        </cell>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </cells>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </topology>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <cache>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </cache>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <secmodel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model>selinux</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <doi>0</doi>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </secmodel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <secmodel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model>dac</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <doi>0</doi>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </secmodel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </host>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <guest>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <os_type>hvm</os_type>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <arch name='i686'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <wordsize>32</wordsize>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <domain type='qemu'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <domain type='kvm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </arch>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <pae/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <nonpae/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <acpi default='on' toggle='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <apic default='on' toggle='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <cpuselection/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <deviceboot/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <disksnapshot default='on' toggle='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <externalSnapshot/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </guest>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <guest>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <os_type>hvm</os_type>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <arch name='x86_64'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <wordsize>64</wordsize>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <domain type='qemu'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <domain type='kvm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </arch>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <acpi default='on' toggle='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <apic default='on' toggle='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <cpuselection/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <deviceboot/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <disksnapshot default='on' toggle='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <externalSnapshot/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </guest>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 
Dec  5 06:53:30 np0005546909 nova_compute[187208]: </capabilities>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: #033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.872 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.875 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  5 06:53:30 np0005546909 nova_compute[187208]: <domainCapabilities>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <domain>kvm</domain>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <arch>i686</arch>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <vcpu max='240'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <iothreads supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <os supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <enum name='firmware'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <loader supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>rom</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pflash</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='readonly'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>yes</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='secure'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </loader>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <cpu>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='maximumMigratable'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <vendor>AMD</vendor>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='succor'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='custom' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-128'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-256'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-512'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SierraForest'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Snowridge'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='athlon'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='athlon-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='core2duo'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='core2duo-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='coreduo'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='coreduo-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='n270'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='n270-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='phenom'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='phenom-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <memoryBacking supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <enum name='sourceType'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <value>file</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <value>anonymous</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <value>memfd</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </memoryBacking>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <disk supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='diskDevice'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>disk</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>cdrom</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>floppy</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>lun</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>ide</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>fdc</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>sata</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <graphics supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vnc</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>egl-headless</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <video supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='modelType'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vga</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>cirrus</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>none</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>bochs</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>ramfb</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <hostdev supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='mode'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>subsystem</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='startupPolicy'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>mandatory</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>requisite</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>optional</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='subsysType'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pci</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='capsType'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='pciBackend'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </hostdev>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <rng supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>random</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>egd</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <filesystem supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='driverType'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>path</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>handle</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>virtiofs</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </filesystem>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <tpm supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>tpm-tis</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>tpm-crb</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>emulator</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>external</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='backendVersion'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>2.0</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </tpm>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <redirdev supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </redirdev>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <channel supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </channel>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <crypto supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='model'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>qemu</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </crypto>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <interface supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='backendType'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>passt</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <panic supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>isa</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>hyperv</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </panic>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <console supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>null</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vc</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>dev</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>file</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pipe</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>stdio</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>udp</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>tcp</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>qemu-vdagent</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </console>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <gic supported='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <genid supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <backup supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <async-teardown supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <ps2 supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <sev supported='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <sgx supported='no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <hyperv supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='features'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>relaxed</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vapic</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>spinlocks</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vpindex</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>runtime</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>synic</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>stimer</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>reset</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>vendor_id</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>frequencies</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>reenlightenment</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>tlbflush</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>ipi</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>avic</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>emsr_bitmap</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>xmm_input</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <defaults>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </defaults>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </hyperv>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <launchSecurity supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='sectype'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>tdx</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </launchSecurity>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: </domainCapabilities>
Dec  5 06:53:30 np0005546909 nova_compute[187208]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 06:53:30 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.881 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  5 06:53:30 np0005546909 nova_compute[187208]: <domainCapabilities>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <domain>kvm</domain>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <arch>i686</arch>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <vcpu max='4096'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <iothreads supported='yes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <os supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <enum name='firmware'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <loader supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>rom</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>pflash</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='readonly'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>yes</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='secure'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </loader>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:  <cpu>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <enum name='maximumMigratable'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <vendor>AMD</vendor>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='succor'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:    <mode name='custom' supported='yes'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-128'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-256'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx10-512'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v1'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v2'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v3'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v4'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server'>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:30 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <memoryBacking supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <enum name='sourceType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>anonymous</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>memfd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </memoryBacking>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <disk supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='diskDevice'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>disk</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cdrom</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>floppy</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>lun</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>fdc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>sata</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <graphics supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vnc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egl-headless</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <video supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='modelType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vga</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cirrus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>none</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>bochs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ramfb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hostdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='mode'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>subsystem</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='startupPolicy'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>mandatory</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>requisite</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>optional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='subsysType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pci</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='capsType'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='pciBackend'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hostdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <rng supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>random</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <filesystem supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='driverType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>path</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>handle</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtiofs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </filesystem>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <tpm supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-tis</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-crb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emulator</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>external</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendVersion'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>2.0</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </tpm>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <redirdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </redirdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <channel supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </channel>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <crypto supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </crypto>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <interface supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>passt</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <panic supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>isa</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>hyperv</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </panic>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <console supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>null</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dev</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pipe</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stdio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>udp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tcp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu-vdagent</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </console>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <gic supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <genid supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backup supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <async-teardown supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <ps2 supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sev supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sgx supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hyperv supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='features'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>relaxed</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vapic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>spinlocks</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vpindex</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>runtime</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>synic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stimer</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reset</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vendor_id</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>frequencies</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reenlightenment</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tlbflush</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ipi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>avic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emsr_bitmap</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>xmm_input</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hyperv>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <launchSecurity supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='sectype'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tdx</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </launchSecurity>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: </domainCapabilities>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.929 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:30.933 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  5 06:53:31 np0005546909 nova_compute[187208]: <domainCapabilities>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <domain>kvm</domain>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <arch>x86_64</arch>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <vcpu max='240'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <iothreads supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <os supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <enum name='firmware'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <loader supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>rom</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pflash</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='readonly'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>yes</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='secure'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </loader>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <cpu>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='maximumMigratable'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <vendor>AMD</vendor>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='succor'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='custom' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-128'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-256'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-512'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <memoryBacking supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <enum name='sourceType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>anonymous</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>memfd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </memoryBacking>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <disk supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='diskDevice'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>disk</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cdrom</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>floppy</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>lun</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ide</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>fdc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>sata</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <graphics supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vnc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egl-headless</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <video supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='modelType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vga</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cirrus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>none</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>bochs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ramfb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hostdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='mode'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>subsystem</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='startupPolicy'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>mandatory</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>requisite</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>optional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='subsysType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pci</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='capsType'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='pciBackend'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hostdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <rng supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>random</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <filesystem supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='driverType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>path</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>handle</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtiofs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </filesystem>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <tpm supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-tis</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-crb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emulator</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>external</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendVersion'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>2.0</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </tpm>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <redirdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </redirdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <channel supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </channel>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <crypto supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </crypto>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <interface supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>passt</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <panic supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>isa</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>hyperv</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </panic>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <console supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>null</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dev</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pipe</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stdio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>udp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tcp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu-vdagent</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </console>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <gic supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <genid supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backup supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <async-teardown supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <ps2 supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sev supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sgx supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hyperv supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='features'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>relaxed</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vapic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>spinlocks</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vpindex</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>runtime</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>synic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stimer</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reset</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vendor_id</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>frequencies</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reenlightenment</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tlbflush</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ipi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>avic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emsr_bitmap</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>xmm_input</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hyperv>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <launchSecurity supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='sectype'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tdx</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </launchSecurity>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: </domainCapabilities>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.011 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  5 06:53:31 np0005546909 nova_compute[187208]: <domainCapabilities>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <path>/usr/libexec/qemu-kvm</path>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <domain>kvm</domain>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <arch>x86_64</arch>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <vcpu max='4096'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <iothreads supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <os supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <enum name='firmware'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>efi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <loader supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>rom</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pflash</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='readonly'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>yes</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='secure'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>yes</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>no</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </loader>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <cpu>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='host-passthrough' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='hostPassthroughMigratable'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='maximum' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='maximumMigratable'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>on</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>off</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='host-model' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <vendor>AMD</vendor>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='x2apic'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-deadline'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='hypervisor'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc_adjust'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='spec-ctrl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='stibp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='cmp_legacy'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='overflow-recov'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='succor'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='amd-ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='virt-ssbd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='lbrv'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='tsc-scale'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='vmcb-clean'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='flushbyasid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='pause-filter'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='pfthreshold'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='svme-addr-chk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <feature policy='disable' name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <mode name='custom' supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Broadwell-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cascadelake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Cooperlake-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Denverton-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Dhyana-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Genoa-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='auto-ibrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Milan-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amd-psfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='no-nested-data-bp'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='null-sel-clr-base'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='stibp-always-on'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-Rome-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='EPYC-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='GraniteRapids-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-128'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-256'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx10-512'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='prefetchiti'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Haswell-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-noTSX'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v6'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Icelake-Server-v7'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='IvyBridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='KnightsMill-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4fmaps'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-4vnniw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512er'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512pf'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G4-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Opteron_G5-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fma4'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tbm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xop'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SapphireRapids-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='amx-tile'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-bf16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-fp16'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512-vpopcntdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bitalg'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vbmi2'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrc'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fzrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='la57'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='taa-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='tsx-ldtrk'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xfd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='SierraForest-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ifma'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-ne-convert'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx-vnni-int8'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='bus-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cmpccxadd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fbsdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='fsrs'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ibrs-all'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mcdt-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pbrsb-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='psdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='sbdr-ssdp-no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='serialize'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vaes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='vpclmulqdq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Client-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='hle'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='rtm'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Skylake-Server-v5'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512bw'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512cd'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512dq'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512f'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='avx512vl'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='invpcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pcid'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='pku'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='mpx'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v2'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v3'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='core-capability'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='split-lock-detect'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='Snowridge-v4'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='cldemote'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='erms'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='gfni'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdir64b'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='movdiri'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='xsaves'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='athlon-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='core2duo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='coreduo-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='n270-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='ss'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <blockers model='phenom-v1'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnow'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <feature name='3dnowext'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </blockers>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </mode>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <memoryBacking supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <enum name='sourceType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>anonymous</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <value>memfd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </memoryBacking>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <disk supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='diskDevice'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>disk</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cdrom</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>floppy</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>lun</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>fdc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>sata</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <graphics supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vnc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egl-headless</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <video supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='modelType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vga</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>cirrus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>none</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>bochs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ramfb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hostdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='mode'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>subsystem</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='startupPolicy'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>mandatory</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>requisite</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>optional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='subsysType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pci</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>scsi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='capsType'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='pciBackend'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hostdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <rng supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtio-non-transitional</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>random</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>egd</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <filesystem supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='driverType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>path</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>handle</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>virtiofs</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </filesystem>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <tpm supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-tis</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tpm-crb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emulator</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>external</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendVersion'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>2.0</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </tpm>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <redirdev supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='bus'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>usb</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </redirdev>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <channel supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </channel>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <crypto supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendModel'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>builtin</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </crypto>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <interface supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='backendType'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>default</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>passt</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <panic supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='model'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>isa</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>hyperv</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </panic>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <console supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='type'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>null</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vc</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pty</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dev</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>file</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>pipe</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stdio</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>udp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tcp</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>unix</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>qemu-vdagent</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>dbus</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </console>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <gic supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <vmcoreinfo supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <genid supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backingStoreInput supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <backup supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <async-teardown supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <ps2 supported='yes'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sev supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <sgx supported='no'/>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <hyperv supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='features'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>relaxed</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vapic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>spinlocks</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vpindex</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>runtime</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>synic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>stimer</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reset</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>vendor_id</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>frequencies</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>reenlightenment</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tlbflush</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>ipi</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>avic</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>emsr_bitmap</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>xmm_input</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <spinlocks>4095</spinlocks>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <stimer_direct>on</stimer_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_direct>on</tlbflush_direct>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <tlbflush_extended>on</tlbflush_extended>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </defaults>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </hyperv>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    <launchSecurity supported='yes'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      <enum name='sectype'>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:        <value>tdx</value>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:      </enum>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:    </launchSecurity>
Dec  5 06:53:31 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: </domainCapabilities>
Dec  5 06:53:31 np0005546909 nova_compute[187208]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.074 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Secure Boot support detected#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.077 187212 INFO nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.086 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.113 187212 INFO nova.virt.node [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Determined node identity 5111707b-bdc3-4252-b5b7-b3e96ff05344 from /var/lib/nova/compute_id#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.134 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Verified node 5111707b-bdc3-4252-b5b7-b3e96ff05344 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.165 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  5 06:53:31 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9-userdata-shm.mount: Deactivated successfully.
Dec  5 06:53:31 np0005546909 systemd[1]: var-lib-containers-storage-overlay-bd705a07c7bc29a9eafa697375b74dc2ecbeb5bf38a91f03c149fb0a99e25516-merged.mount: Deactivated successfully.
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.627 187212 ERROR nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Could not retrieve compute node resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-774a28d9-a88e-4e5a-9372-5429c75d68b0"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-774a28d9-a88e-4e5a-9372-5429c75d68b0"}]}#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.646 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.646 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.647 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.647 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.810 187212 WARNING nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.811 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6202MB free_disk=73.54329299926758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.812 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:53:31 np0005546909 nova_compute[187208]: 2025-12-05 11:53:31.812 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:53:31 np0005546909 podman[187419]: 2025-12-05 11:53:31.880467105 +0000 UTC m=+2.148683864 container cleanup 8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:53:31 np0005546909 systemd[1]: libpod-conmon-8fff503a8fca280bf637a2afc48747a8a66a8b08a3327fee31082c3568d966e9.scope: Deactivated successfully.
Dec  5 06:53:32 np0005546909 systemd[1]: session-23.scope: Deactivated successfully.
Dec  5 06:53:32 np0005546909 systemd[1]: session-23.scope: Consumed 2min 1.738s CPU time.
Dec  5 06:53:32 np0005546909 systemd-logind[792]: Session 23 logged out. Waiting for processes to exit.
Dec  5 06:53:32 np0005546909 systemd-logind[792]: Removed session 23.
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.325 187212 ERROR nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-9e1b7cdc-c1d2-4580-865f-85886820152d"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '5111707b-bdc3-4252-b5b7-b3e96ff05344' not found: No resource provider with uuid 5111707b-bdc3-4252-b5b7-b3e96ff05344 found  ", "request_id": "req-9e1b7cdc-c1d2-4580-865f-85886820152d"}]}#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.326 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.326 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.787 187212 INFO nova.scheduler.client.report [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [req-122471f4-9d23-45f9-8412-7e0ca534d4f2] Created resource provider record via placement API for resource provider with UUID 5111707b-bdc3-4252-b5b7-b3e96ff05344 and name compute-0.ctlplane.example.com.#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.819 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec  5 06:53:32 np0005546909 nova_compute[187208]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.819 187212 INFO nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.820 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.821 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.877 187212 DEBUG nova.scheduler.client.report [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updated inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.877 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.878 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.970 187212 DEBUG nova.compute.provider_tree [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.993 187212 DEBUG nova.compute.resource_tracker [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.993 187212 DEBUG oslo_concurrency.lockutils [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:53:32 np0005546909 nova_compute[187208]: 2025-12-05 11:53:32.994 187212 DEBUG nova.service [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec  5 06:53:33 np0005546909 nova_compute[187208]: 2025-12-05 11:53:33.070 187212 DEBUG nova.service [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec  5 06:53:33 np0005546909 nova_compute[187208]: 2025-12-05 11:53:33.070 187212 DEBUG nova.servicegroup.drivers.db [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec  5 06:53:37 np0005546909 podman[187510]: 2025-12-05 11:53:37.224957642 +0000 UTC m=+0.073947282 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec  5 06:53:38 np0005546909 systemd-logind[792]: New session 25 of user zuul.
Dec  5 06:53:38 np0005546909 systemd[1]: Started Session 25 of User zuul.
Dec  5 06:53:39 np0005546909 python3.9[187683]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  5 06:53:40 np0005546909 python3.9[187839]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:53:40 np0005546909 systemd[1]: Reloading.
Dec  5 06:53:40 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:53:40 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:53:41 np0005546909 nova_compute[187208]: 2025-12-05 11:53:41.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:53:41 np0005546909 nova_compute[187208]: 2025-12-05 11:53:41.097 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:53:41 np0005546909 python3.9[188025]: ansible-ansible.builtin.service_facts Invoked
Dec  5 06:53:41 np0005546909 network[188042]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  5 06:53:42 np0005546909 network[188043]: 'network-scripts' will be removed from distribution in near future.
Dec  5 06:53:42 np0005546909 network[188044]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  5 06:53:44 np0005546909 podman[188155]: 2025-12-05 11:53:44.791792705 +0000 UTC m=+0.090494108 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  5 06:53:45 np0005546909 python3.9[188343]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:53:46 np0005546909 python3.9[188496]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:46 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 06:53:47 np0005546909 python3.9[188649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:53:48 np0005546909 python3.9[188801]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:53:49 np0005546909 python3.9[188953]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:53:49 np0005546909 python3.9[189105]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:53:49 np0005546909 systemd[1]: Reloading.
Dec  5 06:53:50 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:53:50 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:53:50 np0005546909 podman[189107]: 2025-12-05 11:53:50.021434774 +0000 UTC m=+0.073508099 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec  5 06:53:50 np0005546909 python3.9[189312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:53:51 np0005546909 python3.9[189465]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:53:52 np0005546909 python3.9[189615]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:53:53 np0005546909 python3.9[189767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:53:53 np0005546909 python3.9[189888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935632.6334188-133-180122835217658/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:53:54 np0005546909 python3.9[190040]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec  5 06:53:55 np0005546909 python3.9[190192]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec  5 06:53:56 np0005546909 python3.9[190345]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  5 06:53:57 np0005546909 python3.9[190503]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  5 06:53:59 np0005546909 python3.9[190661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:53:59 np0005546909 python3.9[190782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935638.7289858-201-62735448118087/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:00 np0005546909 python3.9[190932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:00 np0005546909 python3.9[191053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935639.8859148-201-12977640296662/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:01 np0005546909 python3.9[191203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:02 np0005546909 python3.9[191324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764935641.0302138-201-201061544219823/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:02 np0005546909 python3.9[191474]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:54:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:54:02.999 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:54:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:54:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:54:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:54:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:54:03 np0005546909 python3.9[191626]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:54:04 np0005546909 python3.9[191778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:04 np0005546909 python3.9[191899]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935643.5685701-260-17046767679443/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:05 np0005546909 python3.9[192049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:05 np0005546909 python3.9[192125]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:06 np0005546909 python3.9[192275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:07 np0005546909 python3.9[192396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935645.9814017-260-231313481189936/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:07 np0005546909 podman[192520]: 2025-12-05 11:54:07.599010919 +0000 UTC m=+0.075284771 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 06:54:07 np0005546909 python3.9[192556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:08 np0005546909 python3.9[192686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935647.1825607-260-26876874097228/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:09 np0005546909 python3.9[192837]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:09 np0005546909 python3.9[192958]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935648.4935868-260-95108047884505/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:10 np0005546909 python3.9[193109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:10 np0005546909 python3.9[193230]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935649.667252-260-116426040866187/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:11 np0005546909 python3.9[193380]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:11 np0005546909 python3.9[193501]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935650.7880056-260-185492813421047/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:12 np0005546909 python3.9[193651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:12 np0005546909 python3.9[193774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935651.897296-260-202891361344810/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:13 np0005546909 python3.9[193924]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:13 np0005546909 python3.9[194045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935652.9834416-260-69713075083964/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:14 np0005546909 python3.9[194195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:15 np0005546909 podman[194291]: 2025-12-05 11:54:15.007884799 +0000 UTC m=+0.103807302 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  5 06:54:15 np0005546909 python3.9[194330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935654.089444-260-97552924682234/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:15 np0005546909 python3.9[194495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:16 np0005546909 python3.9[194616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935655.2563097-260-133458606956568/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:16 np0005546909 python3.9[194766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:17 np0005546909 python3.9[194842]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:17 np0005546909 python3.9[194994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:18 np0005546909 python3.9[195070]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:19 np0005546909 python3.9[195220]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:19 np0005546909 python3.9[195296]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:20 np0005546909 python3.9[195449]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:20 np0005546909 podman[195574]: 2025-12-05 11:54:20.744967278 +0000 UTC m=+0.067446364 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 06:54:20 np0005546909 python3.9[195622]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:21 np0005546909 python3.9[195774]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:22 np0005546909 python3.9[195926]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:54:22 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:22 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:22 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:22 np0005546909 systemd[1]: Listening on Podman API Socket.
Dec  5 06:54:23 np0005546909 python3.9[196117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:23 np0005546909 python3.9[196240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:24 np0005546909 python3.9[196316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:24 np0005546909 python3.9[196439]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935662.9978337-482-31954752467497/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:26 np0005546909 python3.9[196591]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec  5 06:54:26 np0005546909 python3.9[196743]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:54:28 np0005546909 python3[196895]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:54:28 np0005546909 podman[196930]: 2025-12-05 11:54:28.281975501 +0000 UTC m=+0.056510489 container create 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 06:54:28 np0005546909 podman[196930]: 2025-12-05 11:54:28.250922617 +0000 UTC m=+0.025457645 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec  5 06:54:28 np0005546909 python3[196895]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume 
/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec  5 06:54:29 np0005546909 python3.9[197117]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:54:29 np0005546909 python3.9[197271]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.079 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.109 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.109 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.110 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.110 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.260 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6174MB free_disk=73.5428237915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.327 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.328 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.354 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.367 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.368 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:54:30 np0005546909 nova_compute[187208]: 2025-12-05 11:54:30.368 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:54:30 np0005546909 python3.9[197422]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935669.9160051-546-16239367757230/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:31 np0005546909 python3.9[197498]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:54:31 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:31 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:31 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:32 np0005546909 python3.9[197610]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:54:32 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:32 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:32 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:32 np0005546909 systemd[1]: Starting ceilometer_agent_compute container...
Dec  5 06:54:32 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:32 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:32 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:32 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:32 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:32 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec  5 06:54:32 np0005546909 podman[197649]: 2025-12-05 11:54:32.815613005 +0000 UTC m=+0.120055611 container init 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0)
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + sudo -E kolla_set_configs
Dec  5 06:54:32 np0005546909 podman[197649]: 2025-12-05 11:54:32.837392792 +0000 UTC m=+0.141835378 container start 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
tcib_managed=true)
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: sudo: unable to send audit message: Operation not permitted
Dec  5 06:54:32 np0005546909 podman[197649]: ceilometer_agent_compute
Dec  5 06:54:32 np0005546909 systemd[1]: Started ceilometer_agent_compute container.
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Validating config file
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Copying service configuration files
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: INFO:__main__:Writing out command to execute
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: ++ cat /run_command
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + ARGS=
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + sudo kolla_copy_cacerts
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: sudo: unable to send audit message: Operation not permitted
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + [[ ! -n '' ]]
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + . kolla_extend_start
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + umask 0022
Dec  5 06:54:32 np0005546909 ceilometer_agent_compute[197665]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec  5 06:54:32 np0005546909 podman[197672]: 2025-12-05 11:54:32.927715125 +0000 UTC m=+0.075328792 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec  5 06:54:32 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:54:32 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.service: Failed with result 'exit-code'.
Dec  5 06:54:33 np0005546909 python3.9[197850]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:54:33 np0005546909 systemd[1]: Stopping ceilometer_agent_compute container...
Dec  5 06:54:33 np0005546909 systemd[1]: libpod-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec  5 06:54:33 np0005546909 podman[197854]: 2025-12-05 11:54:33.75597653 +0000 UTC m=+0.052678429 container died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS)
Dec  5 06:54:33 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-6e2187f3ee73ebb3.timer: Deactivated successfully.
Dec  5 06:54:33 np0005546909 systemd[1]: Stopped /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec  5 06:54:33 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-userdata-shm.mount: Deactivated successfully.
Dec  5 06:54:33 np0005546909 systemd[1]: var-lib-containers-storage-overlay-260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5-merged.mount: Deactivated successfully.
Dec  5 06:54:33 np0005546909 podman[197854]: 2025-12-05 11:54:33.954099328 +0000 UTC m=+0.250801207 container cleanup 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:54:33 np0005546909 podman[197854]: ceilometer_agent_compute
Dec  5 06:54:34 np0005546909 podman[197884]: ceilometer_agent_compute
Dec  5 06:54:34 np0005546909 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec  5 06:54:34 np0005546909 systemd[1]: Stopped ceilometer_agent_compute container.
Dec  5 06:54:34 np0005546909 systemd[1]: Starting ceilometer_agent_compute container...
Dec  5 06:54:34 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:34 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:34 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:34 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:34 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/260799ebb1a8b3a0edf8e2b0fc5a557947aaf4a404b2b090fe793b5a44e5d6e5/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:34 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.
Dec  5 06:54:34 np0005546909 podman[197897]: 2025-12-05 11:54:34.188258396 +0000 UTC m=+0.128705460 container init 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + sudo -E kolla_set_configs
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: sudo: unable to send audit message: Operation not permitted
Dec  5 06:54:34 np0005546909 podman[197897]: 2025-12-05 11:54:34.22172839 +0000 UTC m=+0.162175464 container start 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  5 06:54:34 np0005546909 podman[197897]: ceilometer_agent_compute
Dec  5 06:54:34 np0005546909 systemd[1]: Started ceilometer_agent_compute container.
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Validating config file
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Copying service configuration files
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: INFO:__main__:Writing out command to execute
Dec  5 06:54:34 np0005546909 podman[197920]: 2025-12-05 11:54:34.281468922 +0000 UTC m=+0.047138260 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: ++ cat /run_command
Dec  5 06:54:34 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:54:34 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Failed with result 'exit-code'.
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + ARGS=
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + sudo kolla_copy_cacerts
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: sudo: unable to send audit message: Operation not permitted
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + [[ ! -n '' ]]
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + . kolla_extend_start
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + umask 0022
Dec  5 06:54:34 np0005546909 ceilometer_agent_compute[197913]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec  5 06:54:35 np0005546909 python3.9[198096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.127 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.128 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.129 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.130 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.131 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.132 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.133 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.134 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.135 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.136 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.137 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.138 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.139 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.140 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.141 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.142 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.143 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.160 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.162 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.162 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.245 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.327 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.328 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.329 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.330 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.331 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.332 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.333 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.334 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.335 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.336 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.337 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.338 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.339 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.340 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.341 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.342 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.343 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.344 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.345 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.346 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.347 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.349 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.355 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:54:35.362 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:54:35 np0005546909 python3.9[198224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935674.5946898-578-238661738900389/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:36 np0005546909 python3.9[198377]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec  5 06:54:37 np0005546909 python3.9[198529]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:54:37 np0005546909 podman[198681]: 2025-12-05 11:54:37.707880811 +0000 UTC m=+0.053728899 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  5 06:54:37 np0005546909 python3[198682]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:54:38 np0005546909 podman[198736]: 2025-12-05 11:54:38.12812814 +0000 UTC m=+0.047028686 container create 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Dec  5 06:54:38 np0005546909 podman[198736]: 2025-12-05 11:54:38.10140481 +0000 UTC m=+0.020305326 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec  5 06:54:38 np0005546909 python3[198682]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec  5 06:54:38 np0005546909 python3.9[198925]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:54:39 np0005546909 python3.9[199079]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:40 np0005546909 python3.9[199230]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935679.669225-631-211216472428256/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:40 np0005546909 auditd[699]: Audit daemon rotating log files
Dec  5 06:54:40 np0005546909 python3.9[199306]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:54:40 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:40 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:40 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:41 np0005546909 python3.9[199417]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:54:41 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:41 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:41 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:42 np0005546909 systemd[1]: Starting node_exporter container...
Dec  5 06:54:42 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:42 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:42 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:42 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec  5 06:54:42 np0005546909 podman[199457]: 2025-12-05 11:54:42.710505684 +0000 UTC m=+0.541065266 container init 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.729Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.730Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.731Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=arp
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=bcache
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=bonding
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=cpu
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=edac
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=filefd
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netclass
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netdev
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=netstat
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nfs
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=nvme
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=softnet
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=systemd
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=xfs
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.732Z caller=node_exporter.go:117 level=info collector=zfs
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.733Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec  5 06:54:42 np0005546909 node_exporter[199472]: ts=2025-12-05T11:54:42.734Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec  5 06:54:42 np0005546909 podman[199457]: 2025-12-05 11:54:42.751514371 +0000 UTC m=+0.582073953 container start 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 06:54:42 np0005546909 podman[199457]: node_exporter
Dec  5 06:54:42 np0005546909 systemd[1]: Started node_exporter container.
Dec  5 06:54:42 np0005546909 podman[199481]: 2025-12-05 11:54:42.827881624 +0000 UTC m=+0.065746369 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 06:54:43 np0005546909 python3.9[199655]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:54:43 np0005546909 systemd[1]: Stopping node_exporter container...
Dec  5 06:54:43 np0005546909 systemd[1]: libpod-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec  5 06:54:43 np0005546909 podman[199659]: 2025-12-05 11:54:43.730295653 +0000 UTC m=+0.057059809 container died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:54:43 np0005546909 systemd[1]: 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5-655ef2dd01a3c178.timer: Deactivated successfully.
Dec  5 06:54:43 np0005546909 systemd[1]: Stopped /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec  5 06:54:43 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5-userdata-shm.mount: Deactivated successfully.
Dec  5 06:54:43 np0005546909 systemd[1]: var-lib-containers-storage-overlay-5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f-merged.mount: Deactivated successfully.
Dec  5 06:54:43 np0005546909 podman[199659]: 2025-12-05 11:54:43.912904545 +0000 UTC m=+0.239668701 container cleanup 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:54:43 np0005546909 podman[199659]: node_exporter
Dec  5 06:54:43 np0005546909 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  5 06:54:43 np0005546909 podman[199689]: node_exporter
Dec  5 06:54:43 np0005546909 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec  5 06:54:43 np0005546909 systemd[1]: Stopped node_exporter container.
Dec  5 06:54:43 np0005546909 systemd[1]: Starting node_exporter container...
Dec  5 06:54:44 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:44 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:44 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4f5f26e02c0fc68612977d7ab0ce73f399aef6e31c987085b1869d42b12d3f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:44 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.
Dec  5 06:54:44 np0005546909 podman[199703]: 2025-12-05 11:54:44.322542846 +0000 UTC m=+0.321939914 container init 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.338Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.339Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.340Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.340Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=arp
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=bcache
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=bonding
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=btrfs
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=conntrack
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=cpu
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=diskstats
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=edac
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=filefd
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=filesystem
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=infiniband
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=ipvs
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=loadavg
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=mdadm
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=meminfo
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netclass
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netdev
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=netstat
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nfs
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nfsd
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=nvme
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=schedstat
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=sockstat
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=softnet
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=systemd
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=tapestats
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=vmstat
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=xfs
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.341Z caller=node_exporter.go:117 level=info collector=zfs
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.342Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec  5 06:54:44 np0005546909 node_exporter[199718]: ts=2025-12-05T11:54:44.343Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec  5 06:54:44 np0005546909 podman[199703]: 2025-12-05 11:54:44.356627365 +0000 UTC m=+0.356024403 container start 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:54:44 np0005546909 podman[199703]: node_exporter
Dec  5 06:54:44 np0005546909 systemd[1]: Started node_exporter container.
Dec  5 06:54:44 np0005546909 podman[199727]: 2025-12-05 11:54:44.427362445 +0000 UTC m=+0.062878095 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:54:45 np0005546909 python3.9[199900]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:45 np0005546909 podman[199930]: 2025-12-05 11:54:45.231133202 +0000 UTC m=+0.086589337 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 06:54:45 np0005546909 python3.9[200049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935684.5876107-663-9829364768610/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:46 np0005546909 python3.9[200201]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec  5 06:54:46 np0005546909 python3.9[200353]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:54:47 np0005546909 python3[200505]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:54:49 np0005546909 podman[200518]: 2025-12-05 11:54:49.59740021 +0000 UTC m=+1.655622865 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  5 06:54:49 np0005546909 podman[200611]: 2025-12-05 11:54:49.731239672 +0000 UTC m=+0.040984767 container create 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Dec  5 06:54:49 np0005546909 podman[200611]: 2025-12-05 11:54:49.709611141 +0000 UTC m=+0.019356256 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  5 06:54:49 np0005546909 python3[200505]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec  5 06:54:50 np0005546909 python3.9[200799]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:54:51 np0005546909 podman[200925]: 2025-12-05 11:54:51.030058273 +0000 UTC m=+0.080628466 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 06:54:51 np0005546909 python3.9[200969]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:51 np0005546909 python3.9[201123]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935691.2705548-716-266877707357472/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:54:52 np0005546909 python3.9[201199]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:54:52 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:52 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:52 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:53 np0005546909 python3.9[201310]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:54:53 np0005546909 systemd[1]: Reloading.
Dec  5 06:54:53 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:54:53 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:54:53 np0005546909 systemd[1]: Starting podman_exporter container...
Dec  5 06:54:53 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:53 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:53 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:53 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec  5 06:54:53 np0005546909 podman[201350]: 2025-12-05 11:54:53.845403762 +0000 UTC m=+0.158567963 container init 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.861Z caller=handler.go:105 level=info collector=container
Dec  5 06:54:53 np0005546909 podman[201350]: 2025-12-05 11:54:53.879780819 +0000 UTC m=+0.192945000 container start 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:54:53 np0005546909 podman[201350]: podman_exporter
Dec  5 06:54:53 np0005546909 systemd[1]: Starting Podman API Service...
Dec  5 06:54:53 np0005546909 systemd[1]: Started Podman API Service.
Dec  5 06:54:53 np0005546909 systemd[1]: Started podman_exporter container.
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Setting parallel job count to 25"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Using sqlite as database backend"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec  5 06:54:53 np0005546909 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  5 06:54:53 np0005546909 podman[201376]: time="2025-12-05T11:54:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 06:54:53 np0005546909 podman[201375]: 2025-12-05 11:54:53.957866601 +0000 UTC m=+0.057964885 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 06:54:53 np0005546909 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:54:53 np0005546909 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.service: Failed with result 'exit-code'.
Dec  5 06:54:53 np0005546909 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19566 "" "Go-http-client/1.1"
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.978Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.979Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  5 06:54:53 np0005546909 podman_exporter[201365]: ts=2025-12-05T11:54:53.979Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  5 06:54:54 np0005546909 python3.9[201562]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:54:54 np0005546909 systemd[1]: Stopping podman_exporter container...
Dec  5 06:54:54 np0005546909 podman[201376]: @ - - [05/Dec/2025:11:54:53 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec  5 06:54:54 np0005546909 systemd[1]: libpod-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec  5 06:54:54 np0005546909 podman[201566]: 2025-12-05 11:54:54.903684116 +0000 UTC m=+0.062539926 container died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:54:54 np0005546909 systemd[1]: 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-3da900843a1ee59b.timer: Deactivated successfully.
Dec  5 06:54:54 np0005546909 systemd[1]: Stopped /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec  5 06:54:54 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6-userdata-shm.mount: Deactivated successfully.
Dec  5 06:54:54 np0005546909 systemd[1]: var-lib-containers-storage-overlay-467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af-merged.mount: Deactivated successfully.
Dec  5 06:54:55 np0005546909 podman[201566]: 2025-12-05 11:54:55.295206256 +0000 UTC m=+0.454062026 container cleanup 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 06:54:55 np0005546909 podman[201566]: podman_exporter
Dec  5 06:54:55 np0005546909 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  5 06:54:55 np0005546909 podman[201595]: podman_exporter
Dec  5 06:54:55 np0005546909 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec  5 06:54:55 np0005546909 systemd[1]: Stopped podman_exporter container.
Dec  5 06:54:55 np0005546909 systemd[1]: Starting podman_exporter container...
Dec  5 06:54:55 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:54:55 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:55 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/467cb46bad3a0a5e7c1066989acaca15e9231985dac6b8e77c4852d35c2913af/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:54:55 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.
Dec  5 06:54:55 np0005546909 podman[201608]: 2025-12-05 11:54:55.522848202 +0000 UTC m=+0.119051259 container init 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.538Z caller=handler.go:105 level=info collector=container
Dec  5 06:54:55 np0005546909 podman[201376]: @ - - [05/Dec/2025:11:54:55 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  5 06:54:55 np0005546909 podman[201376]: time="2025-12-05T11:54:55Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  5 06:54:55 np0005546909 podman[201608]: 2025-12-05 11:54:55.563959653 +0000 UTC m=+0.160162690 container start 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 06:54:55 np0005546909 podman[201608]: podman_exporter
Dec  5 06:54:55 np0005546909 podman[201376]: @ - - [05/Dec/2025:11:54:55 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19568 "" "Go-http-client/1.1"
Dec  5 06:54:55 np0005546909 systemd[1]: Started podman_exporter container.
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.573Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.573Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  5 06:54:55 np0005546909 podman_exporter[201623]: ts=2025-12-05T11:54:55.574Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  5 06:54:55 np0005546909 podman[201633]: 2025-12-05 11:54:55.620030012 +0000 UTC m=+0.047557766 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:54:56 np0005546909 python3.9[201807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:54:56 np0005546909 python3.9[201930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764935695.781892-748-183864846800840/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  5 06:54:57 np0005546909 python3.9[202082]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec  5 06:54:58 np0005546909 python3.9[202234]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  5 06:54:59 np0005546909 python3[202386]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  5 06:55:02 np0005546909 podman[202399]: 2025-12-05 11:55:02.263944373 +0000 UTC m=+3.184843050 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 06:55:02 np0005546909 podman[202494]: 2025-12-05 11:55:02.422867225 +0000 UTC m=+0.064128633 container create 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7)
Dec  5 06:55:02 np0005546909 podman[202494]: 2025-12-05 11:55:02.386933763 +0000 UTC m=+0.028195221 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 06:55:02 np0005546909 python3[202386]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  5 06:55:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.000 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:55:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.001 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:55:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:55:03.001 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:55:03 np0005546909 python3.9[202684]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:55:03 np0005546909 python3.9[202838]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:04 np0005546909 podman[202961]: 2025-12-05 11:55:04.509120473 +0000 UTC m=+0.068126417 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 06:55:04 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Main process exited, code=exited, status=1/FAILURE
Dec  5 06:55:04 np0005546909 systemd[1]: 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1-2261b38cdfea5d2d.service: Failed with result 'exit-code'.
Dec  5 06:55:04 np0005546909 python3.9[203008]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764935704.0704522-801-263071331169123/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:05 np0005546909 python3.9[203084]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  5 06:55:05 np0005546909 systemd[1]: Reloading.
Dec  5 06:55:05 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:55:05 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:55:06 np0005546909 python3.9[203194]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  5 06:55:06 np0005546909 systemd[1]: Reloading.
Dec  5 06:55:06 np0005546909 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  5 06:55:06 np0005546909 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  5 06:55:06 np0005546909 systemd[1]: Starting openstack_network_exporter container...
Dec  5 06:55:06 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:55:06 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:06 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:06 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:06 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec  5 06:55:06 np0005546909 podman[203235]: 2025-12-05 11:55:06.686415704 +0000 UTC m=+0.139106775 container init 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *bridge.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *coverage.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *datapath.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *iface.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *memory.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovnnorthd.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovn.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *ovsdbserver.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *pmd_perf.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *pmd_rxq.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: INFO    11:55:06 main.go:48: registering *vswitch.Collector
Dec  5 06:55:06 np0005546909 openstack_network_exporter[203250]: NOTICE  11:55:06 main.go:76: listening on https://:9105/metrics
Dec  5 06:55:06 np0005546909 podman[203235]: 2025-12-05 11:55:06.711075011 +0000 UTC m=+0.163766092 container start 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec  5 06:55:06 np0005546909 podman[203235]: openstack_network_exporter
Dec  5 06:55:06 np0005546909 systemd[1]: Started openstack_network_exporter container.
Dec  5 06:55:06 np0005546909 podman[203260]: 2025-12-05 11:55:06.801203609 +0000 UTC m=+0.076417155 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Dec  5 06:55:07 np0005546909 python3.9[203435]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  5 06:55:07 np0005546909 systemd[1]: Stopping openstack_network_exporter container...
Dec  5 06:55:08 np0005546909 systemd[1]: libpod-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec  5 06:55:08 np0005546909 podman[203440]: 2025-12-05 11:55:08.209467991 +0000 UTC m=+0.454671504 container died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container)
Dec  5 06:55:08 np0005546909 systemd[1]: 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7-28ff159ea7c4e12e.timer: Deactivated successfully.
Dec  5 06:55:08 np0005546909 systemd[1]: Stopped /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec  5 06:55:08 np0005546909 podman[203439]: 2025-12-05 11:55:08.330586499 +0000 UTC m=+0.578864071 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 06:55:08 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7-userdata-shm.mount: Deactivated successfully.
Dec  5 06:55:08 np0005546909 systemd[1]: var-lib-containers-storage-overlay-34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b-merged.mount: Deactivated successfully.
Dec  5 06:55:09 np0005546909 podman[203440]: 2025-12-05 11:55:09.685980542 +0000 UTC m=+1.931184065 container cleanup 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 06:55:09 np0005546909 podman[203440]: openstack_network_exporter
Dec  5 06:55:09 np0005546909 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  5 06:55:09 np0005546909 podman[203486]: openstack_network_exporter
Dec  5 06:55:09 np0005546909 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec  5 06:55:09 np0005546909 systemd[1]: Stopped openstack_network_exporter container.
Dec  5 06:55:09 np0005546909 systemd[1]: Starting openstack_network_exporter container...
Dec  5 06:55:09 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:55:09 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:09 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:09 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34da12c020c383f18c79f71ea393aa6d0d87413b239fd34e3dea27b37406f12b/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  5 06:55:09 np0005546909 systemd[1]: Started /usr/bin/podman healthcheck run 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.
Dec  5 06:55:09 np0005546909 podman[203499]: 2025-12-05 11:55:09.902334804 +0000 UTC m=+0.117048812 container init 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *bridge.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *coverage.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *datapath.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *iface.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *memory.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovnnorthd.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovn.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *ovsdbserver.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *pmd_perf.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *pmd_rxq.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: INFO    11:55:09 main.go:48: registering *vswitch.Collector
Dec  5 06:55:09 np0005546909 openstack_network_exporter[203514]: NOTICE  11:55:09 main.go:76: listening on https://:9105/metrics
Dec  5 06:55:09 np0005546909 podman[203499]: 2025-12-05 11:55:09.933586391 +0000 UTC m=+0.148300419 container start 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41)
Dec  5 06:55:09 np0005546909 podman[203499]: openstack_network_exporter
Dec  5 06:55:09 np0005546909 systemd[1]: Started openstack_network_exporter container.
Dec  5 06:55:10 np0005546909 podman[203524]: 2025-12-05 11:55:10.040849031 +0000 UTC m=+0.093178616 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Dec  5 06:55:10 np0005546909 python3.9[203697]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  5 06:55:11 np0005546909 python3.9[203849]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec  5 06:55:12 np0005546909 python3.9[204014]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:12 np0005546909 systemd[1]: Started libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope.
Dec  5 06:55:12 np0005546909 podman[204015]: 2025-12-05 11:55:12.70045315 +0000 UTC m=+0.089806839 container exec 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 06:55:12 np0005546909 podman[204015]: 2025-12-05 11:55:12.730783881 +0000 UTC m=+0.120137540 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 06:55:12 np0005546909 systemd[1]: libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope: Deactivated successfully.
Dec  5 06:55:13 np0005546909 python3.9[204196]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:13 np0005546909 systemd[1]: Started libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope.
Dec  5 06:55:13 np0005546909 podman[204197]: 2025-12-05 11:55:13.888105938 +0000 UTC m=+0.263189287 container exec 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 06:55:14 np0005546909 podman[204217]: 2025-12-05 11:55:14.002306187 +0000 UTC m=+0.096751419 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:55:14 np0005546909 podman[204197]: 2025-12-05 11:55:14.152802068 +0000 UTC m=+0.527885357 container exec_died 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 06:55:14 np0005546909 systemd[1]: libpod-conmon-6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698.scope: Deactivated successfully.
Dec  5 06:55:15 np0005546909 podman[204354]: 2025-12-05 11:55:15.051031516 +0000 UTC m=+0.098694954 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 06:55:15 np0005546909 python3.9[204399]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:15 np0005546909 podman[204530]: 2025-12-05 11:55:15.841973715 +0000 UTC m=+0.134031869 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:55:15 np0005546909 python3.9[204577]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec  5 06:55:16 np0005546909 python3.9[204749]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:16 np0005546909 systemd[1]: Started libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope.
Dec  5 06:55:17 np0005546909 podman[204750]: 2025-12-05 11:55:17.01109178 +0000 UTC m=+0.236450339 container exec de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:55:17 np0005546909 podman[204770]: 2025-12-05 11:55:17.165265417 +0000 UTC m=+0.127221994 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 06:55:17 np0005546909 podman[204750]: 2025-12-05 11:55:17.31235538 +0000 UTC m=+0.537713869 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 06:55:17 np0005546909 systemd[1]: libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope: Deactivated successfully.
Dec  5 06:55:18 np0005546909 python3.9[204934]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:18 np0005546909 systemd[1]: Started libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope.
Dec  5 06:55:18 np0005546909 podman[204935]: 2025-12-05 11:55:18.338914634 +0000 UTC m=+0.219176904 container exec de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 06:55:18 np0005546909 podman[204955]: 2025-12-05 11:55:18.40321015 +0000 UTC m=+0.051747127 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 06:55:18 np0005546909 podman[204935]: 2025-12-05 11:55:18.471756238 +0000 UTC m=+0.352018468 container exec_died de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 06:55:18 np0005546909 systemd[1]: libpod-conmon-de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc.scope: Deactivated successfully.
Dec  5 06:55:19 np0005546909 python3.9[205119]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:20 np0005546909 python3.9[205271]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec  5 06:55:20 np0005546909 python3.9[205437]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:21 np0005546909 systemd[1]: Started libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope.
Dec  5 06:55:21 np0005546909 podman[205438]: 2025-12-05 11:55:21.034458614 +0000 UTC m=+0.081655175 container exec 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 06:55:21 np0005546909 podman[205438]: 2025-12-05 11:55:21.068492001 +0000 UTC m=+0.115688542 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 06:55:21 np0005546909 systemd[1]: libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec  5 06:55:21 np0005546909 podman[205467]: 2025-12-05 11:55:21.198576586 +0000 UTC m=+0.065452450 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 06:55:21 np0005546909 python3.9[205637]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:21 np0005546909 systemd[1]: Started libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope.
Dec  5 06:55:22 np0005546909 podman[205638]: 2025-12-05 11:55:21.999646366 +0000 UTC m=+0.094526085 container exec 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:55:22 np0005546909 podman[205658]: 2025-12-05 11:55:22.079313033 +0000 UTC m=+0.063955347 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 06:55:22 np0005546909 podman[205638]: 2025-12-05 11:55:22.086251822 +0000 UTC m=+0.181131541 container exec_died 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:55:22 np0005546909 systemd[1]: libpod-conmon-164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb.scope: Deactivated successfully.
Dec  5 06:55:22 np0005546909 python3.9[205822]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:23 np0005546909 python3.9[205974]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec  5 06:55:24 np0005546909 python3.9[206139]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:24 np0005546909 systemd[1]: Started libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope.
Dec  5 06:55:24 np0005546909 podman[206140]: 2025-12-05 11:55:24.323560796 +0000 UTC m=+0.113585132 container exec 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Dec  5 06:55:24 np0005546909 podman[206140]: 2025-12-05 11:55:24.355180164 +0000 UTC m=+0.145204480 container exec_died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute)
Dec  5 06:55:24 np0005546909 systemd[1]: libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec  5 06:55:25 np0005546909 python3.9[206324]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:25 np0005546909 systemd[1]: Started libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope.
Dec  5 06:55:25 np0005546909 podman[206325]: 2025-12-05 11:55:25.208579556 +0000 UTC m=+0.096263455 container exec 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm)
Dec  5 06:55:25 np0005546909 podman[206325]: 2025-12-05 11:55:25.243419146 +0000 UTC m=+0.131102955 container exec_died 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 06:55:25 np0005546909 systemd[1]: libpod-conmon-5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1.scope: Deactivated successfully.
Dec  5 06:55:25 np0005546909 podman[206478]: 2025-12-05 11:55:25.827490175 +0000 UTC m=+0.063377861 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:55:26 np0005546909 python3.9[206530]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:26 np0005546909 python3.9[206682]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec  5 06:55:27 np0005546909 python3.9[206847]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:27 np0005546909 systemd[1]: Started libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope.
Dec  5 06:55:27 np0005546909 podman[206848]: 2025-12-05 11:55:27.763672683 +0000 UTC m=+0.092244689 container exec 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:55:27 np0005546909 podman[206848]: 2025-12-05 11:55:27.800952994 +0000 UTC m=+0.129525020 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:55:27 np0005546909 systemd[1]: libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec  5 06:55:28 np0005546909 python3.9[207031]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:29 np0005546909 systemd[1]: Started libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope.
Dec  5 06:55:29 np0005546909 podman[207032]: 2025-12-05 11:55:29.056243734 +0000 UTC m=+0.555236362 container exec 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:55:29 np0005546909 podman[207052]: 2025-12-05 11:55:29.151239602 +0000 UTC m=+0.076054495 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 06:55:29 np0005546909 podman[207032]: 2025-12-05 11:55:29.157148681 +0000 UTC m=+0.656141279 container exec_died 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:55:29 np0005546909 systemd[1]: libpod-conmon-5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5.scope: Deactivated successfully.
Dec  5 06:55:29 np0005546909 python3.9[207216]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:30 np0005546909 nova_compute[187208]: 2025-12-05 11:55:30.360 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:30 np0005546909 nova_compute[187208]: 2025-12-05 11:55:30.384 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:30 np0005546909 nova_compute[187208]: 2025-12-05 11:55:30.384 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:30 np0005546909 python3.9[207368]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.078 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.102 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.102 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.103 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.103 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.262 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5948MB free_disk=73.37312698364258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.263 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:55:31 np0005546909 python3.9[207533]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.489 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.489 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.507 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.526 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.527 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:55:31 np0005546909 nova_compute[187208]: 2025-12-05 11:55:31.527 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:55:31 np0005546909 systemd[1]: Started libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope.
Dec  5 06:55:31 np0005546909 podman[207534]: 2025-12-05 11:55:31.616415608 +0000 UTC m=+0.322661475 container exec 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 06:55:31 np0005546909 podman[207554]: 2025-12-05 11:55:31.787325195 +0000 UTC m=+0.151016327 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 06:55:31 np0005546909 podman[207534]: 2025-12-05 11:55:31.908089942 +0000 UTC m=+0.614335799 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 06:55:31 np0005546909 systemd[1]: libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec  5 06:55:32 np0005546909 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:32 np0005546909 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:55:32 np0005546909 nova_compute[187208]: 2025-12-05 11:55:32.509 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:55:33 np0005546909 python3.9[207718]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:33 np0005546909 systemd[1]: Started libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope.
Dec  5 06:55:33 np0005546909 podman[207719]: 2025-12-05 11:55:33.419169877 +0000 UTC m=+0.282574494 container exec 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:55:33 np0005546909 podman[207738]: 2025-12-05 11:55:33.499269456 +0000 UTC m=+0.066296364 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:55:33 np0005546909 podman[207719]: 2025-12-05 11:55:33.505155925 +0000 UTC m=+0.368560502 container exec_died 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 06:55:33 np0005546909 systemd[1]: libpod-conmon-55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6.scope: Deactivated successfully.
Dec  5 06:55:34 np0005546909 python3.9[207902]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:34 np0005546909 podman[208026]: 2025-12-05 11:55:34.796386187 +0000 UTC m=+0.072866923 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:55:34 np0005546909 python3.9[208068]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec  5 06:55:35 np0005546909 python3.9[208238]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:35 np0005546909 systemd[1]: Started libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope.
Dec  5 06:55:35 np0005546909 podman[208239]: 2025-12-05 11:55:35.853790076 +0000 UTC m=+0.110750811 container exec 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Dec  5 06:55:35 np0005546909 podman[208239]: 2025-12-05 11:55:35.89224879 +0000 UTC m=+0.149209505 container exec_died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Dec  5 06:55:35 np0005546909 systemd[1]: libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec  5 06:55:36 np0005546909 python3.9[208422]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  5 06:55:36 np0005546909 systemd[1]: Started libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope.
Dec  5 06:55:36 np0005546909 podman[208423]: 2025-12-05 11:55:36.747171345 +0000 UTC m=+0.093539766 container exec 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm)
Dec  5 06:55:36 np0005546909 podman[208423]: 2025-12-05 11:55:36.751888351 +0000 UTC m=+0.098256752 container exec_died 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 06:55:36 np0005546909 systemd[1]: libpod-conmon-1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7.scope: Deactivated successfully.
Dec  5 06:55:37 np0005546909 python3.9[208606]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:38 np0005546909 python3.9[208758]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:38 np0005546909 podman[208882]: 2025-12-05 11:55:38.616855834 +0000 UTC m=+0.079175264 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 06:55:38 np0005546909 python3.9[208921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:39 np0005546909 python3.9[209049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764935738.2423933-1082-19824319945370/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:40 np0005546909 python3.9[209201]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:40 np0005546909 podman[209202]: 2025-12-05 11:55:40.22897386 +0000 UTC m=+0.070393573 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Dec  5 06:55:40 np0005546909 python3.9[209374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:41 np0005546909 python3.9[209452]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:41 np0005546909 python3.9[209604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:42 np0005546909 python3.9[209682]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wz5evwrs recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:43 np0005546909 python3.9[209834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:43 np0005546909 python3.9[209912]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:44 np0005546909 python3.9[210064]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:55:45 np0005546909 python3[210217]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  5 06:55:45 np0005546909 podman[210218]: 2025-12-05 11:55:45.208155475 +0000 UTC m=+0.060630932 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 06:55:45 np0005546909 python3.9[210392]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:46 np0005546909 podman[210442]: 2025-12-05 11:55:46.147409491 +0000 UTC m=+0.105655485 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 06:55:46 np0005546909 python3.9[210485]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:47 np0005546909 python3.9[210649]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:47 np0005546909 python3.9[210727]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:48 np0005546909 python3.9[210879]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:48 np0005546909 python3.9[210957]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:49 np0005546909 python3.9[211109]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:50 np0005546909 python3.9[211187]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:50 np0005546909 python3.9[211339]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  5 06:55:51 np0005546909 python3.9[211464]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764935750.2456598-1207-140936292841325/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:51 np0005546909 podman[211588]: 2025-12-05 11:55:51.904886413 +0000 UTC m=+0.068760087 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 06:55:52 np0005546909 python3.9[211632]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:52 np0005546909 python3.9[211787]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:55:53 np0005546909 python3.9[211942]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:54 np0005546909 python3.9[212094]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:55:54 np0005546909 python3.9[212247]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  5 06:55:55 np0005546909 python3.9[212401]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  5 06:55:56 np0005546909 podman[212528]: 2025-12-05 11:55:56.129065749 +0000 UTC m=+0.074218706 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 06:55:56 np0005546909 python3.9[212580]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  5 06:55:56 np0005546909 systemd[1]: session-25.scope: Deactivated successfully.
Dec  5 06:55:56 np0005546909 systemd[1]: session-25.scope: Consumed 1min 41.272s CPU time.
Dec  5 06:55:56 np0005546909 systemd-logind[792]: Session 25 logged out. Waiting for processes to exit.
Dec  5 06:55:56 np0005546909 systemd-logind[792]: Removed session 25.
Dec  5 06:56:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:56:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:56:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:03.002 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:56:05 np0005546909 podman[212605]: 2025-12-05 11:56:05.201764718 +0000 UTC m=+0.057853283 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec  5 06:56:09 np0005546909 podman[212625]: 2025-12-05 11:56:09.208359998 +0000 UTC m=+0.057988166 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:56:11 np0005546909 podman[212645]: 2025-12-05 11:56:11.196753482 +0000 UTC m=+0.055373881 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container)
Dec  5 06:56:16 np0005546909 podman[212666]: 2025-12-05 11:56:16.21017849 +0000 UTC m=+0.068929152 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:56:16 np0005546909 podman[212691]: 2025-12-05 11:56:16.350828434 +0000 UTC m=+0.110372270 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 06:56:22 np0005546909 podman[212720]: 2025-12-05 11:56:22.206146894 +0000 UTC m=+0.061947911 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 06:56:27 np0005546909 podman[212740]: 2025-12-05 11:56:27.244682047 +0000 UTC m=+0.087930262 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.108 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.311 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.313 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6011MB free_disk=73.37332534790039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.313 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.314 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.405 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.406 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.429 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.446 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.449 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:56:31 np0005546909 nova_compute[187208]: 2025-12-05 11:56:31.449 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:56:32 np0005546909 nova_compute[187208]: 2025-12-05 11:56:32.428 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:32 np0005546909 nova_compute[187208]: 2025-12-05 11:56:32.429 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:33 np0005546909 nova_compute[187208]: 2025-12-05 11:56:33.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:33 np0005546909 nova_compute[187208]: 2025-12-05 11:56:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:34 np0005546909 nova_compute[187208]: 2025-12-05 11:56:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:34 np0005546909 nova_compute[187208]: 2025-12-05 11:56:34.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:56:34 np0005546909 nova_compute[187208]: 2025-12-05 11:56:34.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:56:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.203 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:56:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.204 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 06:56:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:56:35.205 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:56:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:56:36 np0005546909 podman[212764]: 2025-12-05 11:56:36.232893036 +0000 UTC m=+0.089602470 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:56:40 np0005546909 podman[212784]: 2025-12-05 11:56:40.205089342 +0000 UTC m=+0.063791154 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  5 06:56:42 np0005546909 podman[212802]: 2025-12-05 11:56:42.19081911 +0000 UTC m=+0.049666596 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 06:56:47 np0005546909 podman[212824]: 2025-12-05 11:56:47.201307971 +0000 UTC m=+0.055830784 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 06:56:47 np0005546909 podman[212825]: 2025-12-05 11:56:47.264508187 +0000 UTC m=+0.114205670 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:56:53 np0005546909 podman[212870]: 2025-12-05 11:56:53.209339159 +0000 UTC m=+0.061615180 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 06:56:58 np0005546909 podman[212890]: 2025-12-05 11:56:58.21934809 +0000 UTC m=+0.067210328 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:57:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:57:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:57:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:57:03.003 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:57:07 np0005546909 podman[212915]: 2025-12-05 11:57:07.228006261 +0000 UTC m=+0.074484283 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec  5 06:57:11 np0005546909 podman[212935]: 2025-12-05 11:57:11.232475343 +0000 UTC m=+0.086815280 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 06:57:13 np0005546909 podman[212955]: 2025-12-05 11:57:13.205846426 +0000 UTC m=+0.064495501 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=)
Dec  5 06:57:18 np0005546909 podman[212978]: 2025-12-05 11:57:18.206814493 +0000 UTC m=+0.058588925 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:57:18 np0005546909 podman[212979]: 2025-12-05 11:57:18.236624154 +0000 UTC m=+0.089641112 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 06:57:24 np0005546909 podman[213026]: 2025-12-05 11:57:24.219060959 +0000 UTC m=+0.069736219 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec  5 06:57:29 np0005546909 podman[213046]: 2025-12-05 11:57:29.217710588 +0000 UTC m=+0.065266942 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:57:31 np0005546909 nova_compute[187208]: 2025-12-05 11:57:31.057 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:31 np0005546909 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:31 np0005546909 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:57:31 np0005546909 nova_compute[187208]: 2025-12-05 11:57:31.862 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.028 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.028 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.322 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.477 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.478 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6051MB free_disk=73.37330627441406GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.478 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.479 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.559 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.560 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.587 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.605 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.606 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:57:32 np0005546909 nova_compute[187208]: 2025-12-05 11:57:32.607 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:57:33 np0005546909 nova_compute[187208]: 2025-12-05 11:57:33.609 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:34 np0005546909 nova_compute[187208]: 2025-12-05 11:57:34.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:34 np0005546909 nova_compute[187208]: 2025-12-05 11:57:34.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:34 np0005546909 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:34 np0005546909 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:57:34 np0005546909 nova_compute[187208]: 2025-12-05 11:57:34.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:57:38 np0005546909 podman[213072]: 2025-12-05 11:57:38.210217674 +0000 UTC m=+0.062728132 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 06:57:42 np0005546909 podman[213093]: 2025-12-05 11:57:42.217081695 +0000 UTC m=+0.072836897 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 06:57:44 np0005546909 podman[213112]: 2025-12-05 11:57:44.194956593 +0000 UTC m=+0.049314373 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 06:57:49 np0005546909 podman[213133]: 2025-12-05 11:57:49.353737033 +0000 UTC m=+0.191976739 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 06:57:49 np0005546909 podman[213134]: 2025-12-05 11:57:49.429850621 +0000 UTC m=+0.158327770 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 06:57:55 np0005546909 podman[213184]: 2025-12-05 11:57:55.217869589 +0000 UTC m=+0.069006759 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:58:00 np0005546909 podman[213204]: 2025-12-05 11:58:00.223099581 +0000 UTC m=+0.074885795 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 06:58:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:03.004 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:09 np0005546909 podman[213230]: 2025-12-05 11:58:09.217890112 +0000 UTC m=+0.067926413 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:58:13 np0005546909 podman[213250]: 2025-12-05 11:58:13.215945799 +0000 UTC m=+0.065438074 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:58:15 np0005546909 podman[213269]: 2025-12-05 11:58:15.222840604 +0000 UTC m=+0.068543280 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  5 06:58:20 np0005546909 podman[213294]: 2025-12-05 11:58:20.205906889 +0000 UTC m=+0.061255289 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 06:58:20 np0005546909 podman[213295]: 2025-12-05 11:58:20.248275777 +0000 UTC m=+0.092486400 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 06:58:26 np0005546909 podman[213341]: 2025-12-05 11:58:26.216890608 +0000 UTC m=+0.068190340 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.079 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.080 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 06:58:30 np0005546909 nova_compute[187208]: 2025-12-05 11:58:30.142 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:31 np0005546909 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:31 np0005546909 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:58:31 np0005546909 nova_compute[187208]: 2025-12-05 11:58:31.151 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:58:31 np0005546909 nova_compute[187208]: 2025-12-05 11:58:31.197 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 06:58:31 np0005546909 nova_compute[187208]: 2025-12-05 11:58:31.197 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:31 np0005546909 podman[213361]: 2025-12-05 11:58:31.220721756 +0000 UTC m=+0.068224922 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:58:33 np0005546909 nova_compute[187208]: 2025-12-05 11:58:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.261 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6059MB free_disk=73.37367248535156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.262 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.451 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.451 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.517 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.610 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.611 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.645 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.673 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.695 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.718 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.720 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:58:34 np0005546909 nova_compute[187208]: 2025-12-05 11:58:34.721 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:34.802 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:58:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:34.804 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 11:58:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 06:58:35 np0005546909 nova_compute[187208]: 2025-12-05 11:58:35.720 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:35 np0005546909 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:35 np0005546909 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:35 np0005546909 nova_compute[187208]: 2025-12-05 11:58:35.721 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:58:36 np0005546909 nova_compute[187208]: 2025-12-05 11:58:36.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:36 np0005546909 nova_compute[187208]: 2025-12-05 11:58:36.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.612 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.612 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.630 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.712 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.713 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.731 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.734 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.734 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.741 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.741 187212 INFO nova.compute.claims [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.841 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.932 187212 DEBUG nova.compute.provider_tree [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.955 187212 DEBUG nova.scheduler.client.report [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.976 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.976 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.978 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.983 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:37 np0005546909 nova_compute[187208]: 2025-12-05 11:58:37.983 187212 INFO nova.compute.claims [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.055 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.077 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.098 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.157 187212 DEBUG nova.compute.provider_tree [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.227 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.228 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.229 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating image(s)#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.230 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.230 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.231 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.232 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.232 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.259 187212 DEBUG nova.scheduler.client.report [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.286 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.286 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.345 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.346 187212 DEBUG nova.network.neutron [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.370 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.388 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.481 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.483 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.484 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating image(s)#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.484 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.485 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.485 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:38 np0005546909 nova_compute[187208]: 2025-12-05 11:58:38.486 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:39 np0005546909 nova_compute[187208]: 2025-12-05 11:58:39.821 187212 DEBUG nova.network.neutron [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 06:58:39 np0005546909 nova_compute[187208]: 2025-12-05 11:58:39.821 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:58:40 np0005546909 podman[213386]: 2025-12-05 11:58:40.561277907 +0000 UTC m=+0.401903529 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.374 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.424 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.426 187212 DEBUG nova.virt.images [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] a6987852-063f-405d-a848-6b382694811e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.426 187212 DEBUG nova.privsep.utils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.427 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.649 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.part /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.653 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.715 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524.converted --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.716 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.733 187212 INFO oslo.privsep.daemon [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpllntry_p/privsep.sock']#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.735 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 3.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:41 np0005546909 nova_compute[187208]: 2025-12-05 11:58:41.735 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.464 187212 INFO oslo.privsep.daemon [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.320 213424 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.324 213424 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.326 213424 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.326 213424 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213424#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.468 187212 WARNING oslo_privsep.priv_context [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] privsep daemon already running#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.570 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.582 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.622 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.624 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.625 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.636 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.651 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.653 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.695 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.696 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.745 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.747 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.748 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.764 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.786 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.805 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.806 187212 DEBUG nova.virt.disk.api [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.806 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:58:42.806 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.858 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.859 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.872 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.873 187212 DEBUG nova.virt.disk.api [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.874 187212 DEBUG nova.objects.instance [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.887 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.888 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.888 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.902 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Ensure instance console log exists: /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.903 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.904 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.906 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.910 187212 WARNING nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.918 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.919 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.923 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.923 187212 DEBUG nova.virt.libvirt.host [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.924 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.924 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.925 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.926 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.927 187212 DEBUG nova.virt.hardware [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.931 187212 DEBUG nova.privsep.utils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.932 187212 DEBUG nova.objects.instance [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.941 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.942 187212 DEBUG nova.virt.disk.api [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Checking if we can resize image /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.942 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:42 np0005546909 nova_compute[187208]: 2025-12-05 11:58:42.987 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <uuid>caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</uuid>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <name>instance-00000001</name>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:name>tempest-AutoAllocateNetworkTest-server-2092831344</nova:name>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:58:42</nova:creationTime>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:        <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="serial">caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="uuid">caa6c7c3-7eb3-4636-a7ad-7b605ef393ba</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/console.log" append="off"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:58:42 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:58:42 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:58:42 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:58:42 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.010 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.011 187212 DEBUG nova.virt.disk.api [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Cannot resize image /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.011 187212 DEBUG nova.objects.instance [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.026 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.026 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Ensure instance console log exists: /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.027 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.029 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.033 187212 WARNING nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.037 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.037 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.host [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.040 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.041 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.042 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.043 187212 DEBUG nova.virt.hardware [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.047 187212 DEBUG nova.objects.instance [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'pci_devices' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.053 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.054 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.054 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Using config drive#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.065 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <uuid>c9498e91-01c5-47f7-b3ba-6291bb43635d</uuid>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <name>instance-00000002</name>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-640270599</nova:name>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:58:43</nova:creationTime>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:user uuid="e2dbb72c61fa4cdfa0de840c11264065">tempest-DeleteServersAdminTestJSON-1088655224-project-member</nova:user>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:        <nova:project uuid="43e0982f67c94ddb8da10556d22f6e39">tempest-DeleteServersAdminTestJSON-1088655224</nova:project>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="serial">c9498e91-01c5-47f7-b3ba-6291bb43635d</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="uuid">c9498e91-01c5-47f7-b3ba-6291bb43635d</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/console.log" append="off"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:58:43 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:58:43 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:58:43 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:58:43 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.116 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.116 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:43 np0005546909 nova_compute[187208]: 2025-12-05 11:58:43.117 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Using config drive#033[00m
Dec  5 06:58:44 np0005546909 podman[213458]: 2025-12-05 11:58:44.248511756 +0000 UTC m=+0.079633705 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.298 187212 INFO nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Creating config drive at /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.303 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthw70lrf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.329 187212 INFO nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Creating config drive at /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.335 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpigrnvx57 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.426 187212 DEBUG oslo_concurrency.processutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpthw70lrf" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.461 187212 DEBUG oslo_concurrency.processutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpigrnvx57" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:44 np0005546909 systemd-machined[153543]: New machine qemu-1-instance-00000001.
Dec  5 06:58:44 np0005546909 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec  5 06:58:44 np0005546909 systemd-machined[153543]: New machine qemu-2-instance-00000002.
Dec  5 06:58:44 np0005546909 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.858 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935924.8576803, c9498e91-01c5-47f7-b3ba-6291bb43635d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.859 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.862 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.862 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.866 187212 INFO nova.virt.libvirt.driver [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance spawned successfully.#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.866 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.923 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.928 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.932 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.932 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.933 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.933 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.934 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.934 187212 DEBUG nova.virt.libvirt.driver [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.965 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.966 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935924.8586826, c9498e91-01c5-47f7-b3ba-6291bb43635d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:58:44 np0005546909 nova_compute[187208]: 2025-12-05 11:58:44.966 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Started (Lifecycle Event)#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.003 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.007 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.014 187212 INFO nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 6.53 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.015 187212 DEBUG nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.029 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.040 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935925.0404015, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.040 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.042 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.042 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.056 187212 INFO nova.virt.libvirt.driver [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance spawned successfully.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.056 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.073 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.082 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.089 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.090 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.090 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.091 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.091 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.092 187212 DEBUG nova.virt.libvirt.driver [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.100 187212 INFO nova.compute.manager [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 7.27 seconds to build instance.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.121 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.121 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935925.041058, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.122 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Started (Lifecycle Event)#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.158 187212 DEBUG oslo_concurrency.lockutils [None req-9a64530b-9246-4d3f-89fb-09cdacdd6eb4 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.161 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.165 187212 INFO nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 6.94 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.165 187212 DEBUG nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.176 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.217 187212 INFO nova.compute.manager [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 7.51 seconds to build instance.#033[00m
Dec  5 06:58:45 np0005546909 nova_compute[187208]: 2025-12-05 11:58:45.231 187212 DEBUG oslo_concurrency.lockutils [None req-bc47fffb-c62c-4ddc-82ba-19543f3d95ee c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:46 np0005546909 podman[213530]: 2025-12-05 11:58:46.37179 +0000 UTC m=+0.061986900 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.210 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.212 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.213 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.214 187212 INFO nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Terminating instance#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.215 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.215 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquired lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.216 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:58:47 np0005546909 nova_compute[187208]: 2025-12-05 11:58:47.570 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.223 187212 DEBUG nova.network.neutron [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.237 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Releasing lock "refresh_cache-c9498e91-01c5-47f7-b3ba-6291bb43635d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.238 187212 DEBUG nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 06:58:48 np0005546909 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec  5 06:58:48 np0005546909 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 3.673s CPU time.
Dec  5 06:58:48 np0005546909 systemd-machined[153543]: Machine qemu-2-instance-00000002 terminated.
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.487 187212 INFO nova.virt.libvirt.driver [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance destroyed successfully.#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.488 187212 DEBUG nova.objects.instance [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lazy-loading 'resources' on Instance uuid c9498e91-01c5-47f7-b3ba-6291bb43635d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.514 187212 INFO nova.virt.libvirt.driver [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deleting instance files /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d_del#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.516 187212 INFO nova.virt.libvirt.driver [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deletion of /var/lib/nova/instances/c9498e91-01c5-47f7-b3ba-6291bb43635d_del complete#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.600 187212 DEBUG nova.virt.libvirt.host [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.601 187212 INFO nova.virt.libvirt.host [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] UEFI support detected#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.602 187212 INFO nova.compute.manager [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.603 187212 DEBUG oslo.service.loopingcall [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.603 187212 DEBUG nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 06:58:48 np0005546909 nova_compute[187208]: 2025-12-05 11:58:48.604 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.222 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.250 187212 DEBUG nova.network.neutron [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.276 187212 INFO nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Took 0.67 seconds to deallocate network for instance.#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.349 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.350 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.428 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.462 187212 ERROR nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] [req-0090b723-cf18-45d2-bdaf-3a4affcbea42] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 5111707b-bdc3-4252-b5b7-b3e96ff05344.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-0090b723-cf18-45d2-bdaf-3a4affcbea42"}]}#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.477 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.494 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.495 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.514 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.538 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.611 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.656 187212 DEBUG nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updated inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with generation 6 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.657 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 generation from 6 to 7 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.658 187212 DEBUG nova.compute.provider_tree [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.689 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.727 187212 INFO nova.scheduler.client.report [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Deleted allocations for instance c9498e91-01c5-47f7-b3ba-6291bb43635d#033[00m
Dec  5 06:58:49 np0005546909 nova_compute[187208]: 2025-12-05 11:58:49.808 187212 DEBUG oslo_concurrency.lockutils [None req-5ee4983d-adf4-4a6f-bdbc-98d9497e8f17 96490efa38844aa99ad29545e38fa756 2827016477f6427a86121426c777a037 - - default default] Lock "c9498e91-01c5-47f7-b3ba-6291bb43635d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:51 np0005546909 podman[213560]: 2025-12-05 11:58:51.238949331 +0000 UTC m=+0.083415680 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 06:58:51 np0005546909 podman[213561]: 2025-12-05 11:58:51.2628377 +0000 UTC m=+0.099037981 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.724 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.725 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.752 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.807 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.808 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.822 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.822 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.825 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.829 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.829 187212 INFO nova.compute.claims [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.853 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.854 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.891 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.892 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.894 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.919 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:53 np0005546909 nova_compute[187208]: 2025-12-05 11:58:53.924 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.032 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.034 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.071 187212 DEBUG nova.compute.provider_tree [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.088 187212 DEBUG nova.scheduler.client.report [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.115 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.117 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.121 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.130 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.131 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.172 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.173 187212 DEBUG nova.network.neutron [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.295 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.322 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.415 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.416 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.417 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating image(s)#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.417 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.418 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.419 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.432 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.458 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.477 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.497 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.498 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.500 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.506 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.507 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.510 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.511 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.511 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.526 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.565 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.566 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.594 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.603 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.604 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.623 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.645 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.646 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.646 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.697 187212 DEBUG nova.network.neutron [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.697 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.698 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.699 187212 DEBUG nova.virt.disk.api [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Checking if we can resize image /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.699 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.723 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.725 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.726 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating image(s)#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.727 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.727 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.728 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.746 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.759 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.761 187212 DEBUG nova.virt.disk.api [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Cannot resize image /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.761 187212 DEBUG nova.objects.instance [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'migration_context' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.769 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.782 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.783 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Ensure instance console log exists: /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.784 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.784 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.785 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.786 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.788 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.795 187212 WARNING nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.799 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.799 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.802 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.803 187212 DEBUG nova.virt.libvirt.host [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.804 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.804 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.805 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.806 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.807 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.808 187212 DEBUG nova.virt.hardware [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.810 187212 DEBUG nova.objects.instance [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.814 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.815 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.815 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.817 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.818 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.828 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.844 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <uuid>7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</uuid>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <name>instance-00000004</name>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-410072514</nova:name>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:58:54</nova:creationTime>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:user uuid="e2dbb72c61fa4cdfa0de840c11264065">tempest-DeleteServersAdminTestJSON-1088655224-project-member</nova:user>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:        <nova:project uuid="43e0982f67c94ddb8da10556d22f6e39">tempest-DeleteServersAdminTestJSON-1088655224</nova:project>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="serial">7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="uuid">7af3a9ff-9ca1-4bf2-8688-01d7161ba33a</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/console.log" append="off"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:58:54 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:58:54 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:58:54 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:58:54 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.849 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.869 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.869 187212 INFO nova.compute.claims [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.896 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.896 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.925 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.926 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.926 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.956 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.957 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.967 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.968 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.972 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.973 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.973 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Using config drive#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.975 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.978 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.979 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:58:54 np0005546909 nova_compute[187208]: 2025-12-05 11:58:54.979 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.013 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.035 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.041 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.043 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.043 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.073 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.073 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Ensure instance console log exists: /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.074 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.095 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.140 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.141 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.142 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating image(s)#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.142 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.156 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.169 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.190 187212 DEBUG nova.compute.provider_tree [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.208 187212 DEBUG nova.scheduler.client.report [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.213 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.213 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.214 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.229 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.248 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.249 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.252 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.261 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.262 187212 INFO nova.compute.claims [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.290 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.291 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.325 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.326 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.326 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.332 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.332 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.350 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.373 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.379 187212 INFO nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Creating config drive at /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.384 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxf_d6jl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.399 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.400 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.400 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.458 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.459 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.459 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.468 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.469 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.470 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating image(s)
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.470 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.471 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.471 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Ensure instance console log exists: /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.486 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.487 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.487 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.488 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.505 187212 DEBUG oslo_concurrency.processutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpaxf_d6jl" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.548 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.549 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.550 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.561 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 systemd-machined[153543]: New machine qemu-3-instance-00000004.
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.582 187212 DEBUG nova.compute.provider_tree [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 06:58:55 np0005546909 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.599 187212 DEBUG nova.scheduler.client.report [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.617 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.618 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.639 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.640 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.663 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.664 187212 DEBUG nova.network.neutron [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.676 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.676 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.677 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.692 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.716 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.721 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.728 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.729 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Checking if we can resize image /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.730 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.799 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.800 187212 DEBUG nova.virt.disk.api [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Cannot resize image /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.800 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'migration_context' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.823 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.824 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Ensure instance console log exists: /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.825 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.827 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.829 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.830 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating image(s)
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.831 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.831 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.832 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.851 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.910 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.911 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.912 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.922 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.941 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Automatically allocating a network for project fb2c9c006bee4723bc8dd108e19a6728. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.983 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:55 np0005546909 nova_compute[187208]: 2025-12-05 11:58:55.984 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.013 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.015 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.016 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.070 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.071 187212 DEBUG nova.virt.disk.api [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Checking if we can resize image /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.072 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.094 187212 DEBUG nova.network.neutron [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.095 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.124 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.125 187212 DEBUG nova.virt.disk.api [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Cannot resize image /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.126 187212 DEBUG nova.objects.instance [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'migration_context' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.140 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.140 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Ensure instance console log exists: /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.141 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.143 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.147 187212 WARNING nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.151 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.151 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.154 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.154 187212 DEBUG nova.virt.libvirt.host [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.155 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.155 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.156 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.157 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.158 187212 DEBUG nova.virt.hardware [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.162 187212 DEBUG nova.objects.instance [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'pci_devices' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.183 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <uuid>d4d2145a-a261-4c1c-82e7-3f595f46aec6</uuid>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <name>instance-00000007</name>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-1874212349</nova:name>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:58:56</nova:creationTime>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:user uuid="7f2d3ae86c634e16a369a01df5a1a50d">tempest-ServersAdminNegativeTestJSON-1418429776-project-member</nova:user>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:        <nova:project uuid="806d12fc57454fa3a9768a9e6ec2b812">tempest-ServersAdminNegativeTestJSON-1418429776</nova:project>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="serial">d4d2145a-a261-4c1c-82e7-3f595f46aec6</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="uuid">d4d2145a-a261-4c1c-82e7-3f595f46aec6</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/console.log" append="off"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:58:56 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:58:56 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:58:56 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:58:56 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.238 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.239 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.240 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Using config drive#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935936.3283365, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.329 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.333 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.334 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.338 187212 INFO nova.virt.libvirt.driver [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance spawned successfully.#033[00m
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.339 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:58:56 np0005546909 podman[213723]: 2025-12-05 11:58:56.348288148 +0000 UTC m=+0.064547381 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.366 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.372 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.374 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.375 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.376 187212 DEBUG nova.virt.libvirt.driver [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.403 187212 INFO nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Creating config drive at /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.408 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1_8zocm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.426 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.427 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935936.331603, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.428 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Started (Lifecycle Event)
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.433 187212 INFO nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 2.02 seconds to spawn the instance on the hypervisor.
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.434 187212 DEBUG nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.446 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.449 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.478 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.503 187212 INFO nova.compute.manager [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 2.70 seconds to build instance.
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.520 187212 DEBUG oslo_concurrency.lockutils [None req-55fdbf65-7a8e-41e4-b7a3-9eb89364a14f e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:56 np0005546909 nova_compute[187208]: 2025-12-05 11:58:56.535 187212 DEBUG oslo_concurrency.processutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1_8zocm" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:56 np0005546909 systemd-machined[153543]: New machine qemu-4-instance-00000007.
Dec  5 06:58:56 np0005546909 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.367 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.368 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.368 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935937.3671722, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.369 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Resumed (Lifecycle Event)
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.373 187212 INFO nova.virt.libvirt.driver [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance spawned successfully.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.373 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.465 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.470 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.497 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.498 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935937.3672824, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.498 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Started (Lifecycle Event)
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.504 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.504 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.505 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.506 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.506 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.507 187212 DEBUG nova.virt.libvirt.driver [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.529 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.533 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.556 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.567 187212 INFO nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 1.74 seconds to spawn the instance on the hypervisor.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.567 187212 DEBUG nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.593 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.594 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.619 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.630 187212 INFO nova.compute.manager [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 2.56 seconds to build instance.
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.648 187212 DEBUG oslo_concurrency.lockutils [None req-2dea8ff3-6a5e-48ca-8b8d-21513a50232a 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.683 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.684 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.689 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.689 187212 INFO nova.compute.claims [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Claim successful on node compute-0.ctlplane.example.com
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.926 187212 DEBUG nova.compute.provider_tree [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.947 187212 DEBUG nova.scheduler.client.report [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.970 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:57 np0005546909 nova_compute[187208]: 2025-12-05 11:58:57.971 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.021 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.022 187212 DEBUG nova.network.neutron [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.045 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.063 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.149 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.151 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.151 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating image(s)
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.152 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.152 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.153 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.165 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.220 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.225 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.226 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.244 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.312 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.313 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.428 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk 1073741824" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.430 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.430 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.487 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.489 187212 DEBUG nova.virt.disk.api [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Checking if we can resize image /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.490 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.542 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.543 187212 DEBUG nova.virt.disk.api [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Cannot resize image /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.544 187212 DEBUG nova.objects.instance [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'migration_context' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.563 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.563 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Ensure instance console log exists: /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.564 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.564 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.565 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.605 187212 DEBUG nova.network.neutron [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.606 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.607 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.611 187212 WARNING nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.616 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.617 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.621 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.622 187212 DEBUG nova.virt.libvirt.host [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.622 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.623 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.623 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.624 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.625 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.626 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.626 187212 DEBUG nova.virt.hardware [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.630 187212 DEBUG nova.objects.instance [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.644 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <uuid>c0320574-7866-4fe3-a513-f678d16b8726</uuid>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <name>instance-00000008</name>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerExternalEventsTest-server-560546370</nova:name>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:58:58</nova:creationTime>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:user uuid="1665784ba58c46f3b4db3ca4aadf4148">tempest-ServerExternalEventsTest-467289434-project-member</nova:user>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:        <nova:project uuid="6543c5ed4cf043ffa1c5da8b358945f0">tempest-ServerExternalEventsTest-467289434</nova:project>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="serial">c0320574-7866-4fe3-a513-f678d16b8726</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="uuid">c0320574-7866-4fe3-a513-f678d16b8726</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/console.log" append="off"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:58:58 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:58:58 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:58:58 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:58:58 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.700 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.702 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 06:58:58 np0005546909 nova_compute[187208]: 2025-12-05 11:58:58.702 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Using config drive
Dec  5 06:58:59 np0005546909 nova_compute[187208]: 2025-12-05 11:58:59.157 187212 INFO nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Creating config drive at /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config
Dec  5 06:58:59 np0005546909 nova_compute[187208]: 2025-12-05 11:58:59.162 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp18mzspq0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:58:59 np0005546909 nova_compute[187208]: 2025-12-05 11:58:59.291 187212 DEBUG oslo_concurrency.processutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp18mzspq0" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:58:59 np0005546909 systemd-machined[153543]: New machine qemu-5-instance-00000008.
Dec  5 06:58:59 np0005546909 systemd[1]: Started Virtual Machine qemu-5-instance-00000008.
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.274 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.277 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.278 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935940.2737994, c0320574-7866-4fe3-a513-f678d16b8726 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.278 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Resumed (Lifecycle Event)
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.284 187212 INFO nova.virt.libvirt.driver [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance spawned successfully.
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.284 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.306 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.312 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.315 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.316 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.316 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.317 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.317 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.318 187212 DEBUG nova.virt.libvirt.driver [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.351 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.352 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935940.2750397, c0320574-7866-4fe3-a513-f678d16b8726 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.352 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Started (Lifecycle Event)
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.384 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.386 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.399 187212 INFO nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 2.25 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.400 187212 DEBUG nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.408 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.458 187212 INFO nova.compute.manager [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 2.79 seconds to build instance.#033[00m
Dec  5 06:59:00 np0005546909 nova_compute[187208]: 2025-12-05 11:59:00.473 187212 DEBUG oslo_concurrency.lockutils [None req-b7eff9c2-4259-4e56-a41a-e083d1a33670 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.879s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.939 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.940 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.941 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.943 187212 INFO nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Terminating instance#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.943 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.944 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquired lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:01 np0005546909 nova_compute[187208]: 2025-12-05 11:59:01.944 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.118 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.118 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.133 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.205 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.207 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.214 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.214 187212 INFO nova.compute.claims [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.271 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:02 np0005546909 podman[213814]: 2025-12-05 11:59:02.318730592 +0000 UTC m=+0.081351629 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.413 187212 DEBUG nova.compute.provider_tree [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.433 187212 DEBUG nova.scheduler.client.report [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.456 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.457 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.501 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.501 187212 DEBUG nova.network.neutron [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.618 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:59:02 np0005546909 nova_compute[187208]: 2025-12-05 11:59:02.734 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:59:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.005 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.398 187212 DEBUG nova.compute.manager [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG nova.compute.manager [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Acquiring lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Acquired lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.399 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.486 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935928.4857826, c9498e91-01c5-47f7-b3ba-6291bb43635d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.487 187212 INFO nova.compute.manager [-] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.629 187212 DEBUG nova.compute.manager [None req-d877335e-6b07-4608-86b7-e59593134487 - - - - - -] [instance: c9498e91-01c5-47f7-b3ba-6291bb43635d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.650 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.651 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.651 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating image(s)#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.652 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.667 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.717 187212 DEBUG nova.network.neutron [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.718 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.718 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.719 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.719 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.731 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.746 187212 DEBUG nova.network.neutron [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.762 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Releasing lock "refresh_cache-7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.763 187212 DEBUG nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.793 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:03 np0005546909 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec  5 06:59:03 np0005546909 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 8.076s CPU time.
Dec  5 06:59:03 np0005546909 systemd-machined[153543]: Machine qemu-3-instance-00000004 terminated.
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.804 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.807 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.839 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.840 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.840 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.896 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.897 187212 DEBUG nova.virt.disk.api [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Checking if we can resize image /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.897 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.953 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.953 187212 DEBUG nova.virt.disk.api [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Cannot resize image /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.954 187212 DEBUG nova.objects.instance [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'migration_context' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.962 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "c0320574-7866-4fe3-a513-f678d16b8726-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.963 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.964 187212 INFO nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Terminating instance#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.965 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.972 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.972 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Ensure instance console log exists: /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.973 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.974 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.974 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.975 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.984 187212 WARNING nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.992 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.992 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.996 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.996 187212 DEBUG nova.virt.libvirt.host [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.997 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.998 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:03 np0005546909 nova_compute[187208]: 2025-12-05 11:59:03.999 187212 DEBUG nova.virt.hardware [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.002 187212 DEBUG nova.objects.instance [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.011 187212 INFO nova.virt.libvirt.driver [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance destroyed successfully.#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.011 187212 DEBUG nova.objects.instance [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lazy-loading 'resources' on Instance uuid 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.013 187212 DEBUG nova.network.neutron [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.018 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <uuid>9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</uuid>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <name>instance-00000009</name>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminNegativeTestJSON-server-357102285</nova:name>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:03</nova:creationTime>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:user uuid="7f2d3ae86c634e16a369a01df5a1a50d">tempest-ServersAdminNegativeTestJSON-1418429776-project-member</nova:user>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:        <nova:project uuid="806d12fc57454fa3a9768a9e6ec2b812">tempest-ServersAdminNegativeTestJSON-1418429776</nova:project>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="serial">9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="uuid">9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/console.log" append="off"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:04 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:04 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:04 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:04 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.025 187212 INFO nova.virt.libvirt.driver [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deleting instance files /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a_del#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.026 187212 INFO nova.virt.libvirt.driver [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deletion of /var/lib/nova/instances/7af3a9ff-9ca1-4bf2-8688-01d7161ba33a_del complete#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG oslo_concurrency.lockutils [None req-dd10bd71-41a3-4534-a3d6-5b810bee1cf8 f9a1dee77cd6473b84f8fdc6fee57d83 720fa4219a454c7c9bd23a6e99972036 - - default default] Releasing lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquired lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.029 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 INFO nova.compute.manager [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 DEBUG oslo.service.loopingcall [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.127 187212 DEBUG nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.128 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.131 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Using config drive#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.200 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.318 187212 INFO nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Creating config drive at /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.327 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporrb0r4b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.461 187212 DEBUG oslo_concurrency.processutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmporrb0r4b" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.485 187212 DEBUG nova.network.neutron [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.504 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:04 np0005546909 systemd-machined[153543]: New machine qemu-6-instance-00000009.
Dec  5 06:59:04 np0005546909 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.544 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Releasing lock "refresh_cache-c0320574-7866-4fe3-a513-f678d16b8726" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.544 187212 DEBUG nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.545 187212 DEBUG nova.network.neutron [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.562 187212 INFO nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Took 0.43 seconds to deallocate network for instance.#033[00m
Dec  5 06:59:04 np0005546909 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.609 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.610 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:04 np0005546909 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000008.scope: Consumed 5.252s CPU time.
Dec  5 06:59:04 np0005546909 systemd-machined[153543]: Machine qemu-5-instance-00000008 terminated.
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.784 187212 DEBUG nova.compute.provider_tree [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.800 187212 INFO nova.virt.libvirt.driver [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance destroyed successfully.#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.801 187212 DEBUG nova.objects.instance [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lazy-loading 'resources' on Instance uuid c0320574-7866-4fe3-a513-f678d16b8726 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.803 187212 DEBUG nova.scheduler.client.report [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.824 187212 INFO nova.virt.libvirt.driver [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deleting instance files /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726_del#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.825 187212 INFO nova.virt.libvirt.driver [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deletion of /var/lib/nova/instances/c0320574-7866-4fe3-a513-f678d16b8726_del complete#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.830 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.851 187212 INFO nova.scheduler.client.report [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Deleted allocations for instance 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.879 187212 INFO nova.compute.manager [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.879 187212 DEBUG oslo.service.loopingcall [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.880 187212 DEBUG nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.880 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 06:59:04 np0005546909 nova_compute[187208]: 2025-12-05 11:59:04.918 187212 DEBUG oslo_concurrency.lockutils [None req-a52790ad-fbf0-4298-abe2-b18e5ae6fb06 e2dbb72c61fa4cdfa0de840c11264065 43e0982f67c94ddb8da10556d22f6e39 - - default default] Lock "7af3a9ff-9ca1-4bf2-8688-01d7161ba33a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.978s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.297 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.314 187212 DEBUG nova.network.neutron [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.327 187212 INFO nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Took 0.45 seconds to deallocate network for instance.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.378 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.379 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.537 187212 DEBUG nova.compute.provider_tree [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.555 187212 DEBUG nova.scheduler.client.report [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.581 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.591 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935945.5915148, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.593 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.595 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.595 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.600 187212 INFO nova.virt.libvirt.driver [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance spawned successfully.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.600 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.604 187212 INFO nova.scheduler.client.report [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Deleted allocations for instance c0320574-7866-4fe3-a513-f678d16b8726#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.610 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.613 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.622 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.623 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.623 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.624 187212 DEBUG nova.virt.libvirt.driver [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.653 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.654 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935945.596224, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.654 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.686 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.689 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.693 187212 DEBUG oslo_concurrency.lockutils [None req-2074655c-1dbf-4093-a0fc-47501bb89e32 1665784ba58c46f3b4db3ca4aadf4148 6543c5ed4cf043ffa1c5da8b358945f0 - - default default] Lock "c0320574-7866-4fe3-a513-f678d16b8726" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.697 187212 INFO nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 2.05 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.697 187212 DEBUG nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.708 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.755 187212 INFO nova.compute.manager [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 3.57 seconds to build instance.#033[00m
Dec  5 06:59:05 np0005546909 nova_compute[187208]: 2025-12-05 11:59:05.772 187212 DEBUG oslo_concurrency.lockutils [None req-0227e56b-1121-4a07-a57c-aa18d459a719 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.413 187212 DEBUG nova.objects.instance [None req-4419b7ac-3943-4fe3-a54d-9365fc4a978c 578057bdefa3459f8e1d51e5b47d9030 dc0ef83f10594fbe978d85c5f4210cfe - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.432 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935948.432069, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.432 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Paused (Lifecycle Event)#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.449 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.453 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:08 np0005546909 nova_compute[187208]: 2025-12-05 11:59:08.483 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Dec  5 06:59:09 np0005546909 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec  5 06:59:09 np0005546909 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 4.102s CPU time.
Dec  5 06:59:09 np0005546909 systemd-machined[153543]: Machine qemu-6-instance-00000009 terminated.
Dec  5 06:59:09 np0005546909 nova_compute[187208]: 2025-12-05 11:59:09.402 187212 DEBUG nova.compute.manager [None req-4419b7ac-3943-4fe3-a54d-9365fc4a978c 578057bdefa3459f8e1d51e5b47d9030 dc0ef83f10594fbe978d85c5f4210cfe - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:11 np0005546909 podman[213919]: 2025-12-05 11:59:11.242308068 +0000 UTC m=+0.079611440 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:11.999 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.000 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.001 187212 INFO nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Terminating instance#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquired lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.002 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:12 np0005546909 nova_compute[187208]: 2025-12-05 11:59:12.537 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.638 187212 DEBUG nova.network.neutron [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.655 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Releasing lock "refresh_cache-9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.656 187212 DEBUG nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.663 187212 INFO nova.virt.libvirt.driver [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance destroyed successfully.#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.663 187212 DEBUG nova.objects.instance [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'resources' on Instance uuid 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.682 187212 INFO nova.virt.libvirt.driver [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deleting instance files /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70_del#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.683 187212 INFO nova.virt.libvirt.driver [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deletion of /var/lib/nova/instances/9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70_del complete#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.738 187212 INFO nova.compute.manager [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 0.08 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG oslo.service.loopingcall [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 06:59:13 np0005546909 nova_compute[187208]: 2025-12-05 11:59:13.739 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.251 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.277 187212 DEBUG nova.network.neutron [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.299 187212 INFO nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Took 0.56 seconds to deallocate network for instance.#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.367 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.367 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.744 187212 DEBUG nova.compute.provider_tree [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.771 187212 DEBUG nova.scheduler.client.report [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.802 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.840 187212 INFO nova.scheduler.client.report [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Deleted allocations for instance 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70#033[00m
Dec  5 06:59:14 np0005546909 nova_compute[187208]: 2025-12-05 11:59:14.916 187212 DEBUG oslo_concurrency.lockutils [None req-57e71372-d30c-4137-8f23-1e6859b5b0ed 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:15 np0005546909 podman[213939]: 2025-12-05 11:59:15.230154111 +0000 UTC m=+0.070030168 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.821 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.822 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.823 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.823 187212 INFO nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Terminating instance#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.824 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.824 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquired lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:15 np0005546909 nova_compute[187208]: 2025-12-05 11:59:15.825 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:16 np0005546909 nova_compute[187208]: 2025-12-05 11:59:16.637 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:17 np0005546909 podman[213959]: 2025-12-05 11:59:17.217804639 +0000 UTC m=+0.070848810 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, distribution-scope=public, release=1755695350, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.040 187212 DEBUG nova.network.neutron [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.055 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Releasing lock "refresh_cache-d4d2145a-a261-4c1c-82e7-3f595f46aec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.056 187212 DEBUG nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 06:59:18 np0005546909 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec  5 06:59:18 np0005546909 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 12.852s CPU time.
Dec  5 06:59:18 np0005546909 systemd-machined[153543]: Machine qemu-4-instance-00000007 terminated.
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.299 187212 INFO nova.virt.libvirt.driver [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance destroyed successfully.#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.301 187212 DEBUG nova.objects.instance [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lazy-loading 'resources' on Instance uuid d4d2145a-a261-4c1c-82e7-3f595f46aec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.318 187212 INFO nova.virt.libvirt.driver [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deleting instance files /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6_del#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.319 187212 INFO nova.virt.libvirt.driver [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deletion of /var/lib/nova/instances/d4d2145a-a261-4c1c-82e7-3f595f46aec6_del complete#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.392 187212 INFO nova.compute.manager [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.392 187212 DEBUG oslo.service.loopingcall [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.393 187212 DEBUG nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.393 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 06:59:18 np0005546909 nova_compute[187208]: 2025-12-05 11:59:18.994 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.009 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935944.0075085, 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.009 187212 INFO nova.compute.manager [-] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] VM Stopped (Lifecycle Event)#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.011 187212 DEBUG nova.network.neutron [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.047 187212 DEBUG nova.compute.manager [None req-93a90937-dc92-46a3-9970-aafcf36afaed - - - - - -] [instance: 7af3a9ff-9ca1-4bf2-8688-01d7161ba33a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.049 187212 INFO nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Took 0.66 seconds to deallocate network for instance.#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.143 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.143 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.326 187212 DEBUG nova.compute.provider_tree [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.348 187212 DEBUG nova.scheduler.client.report [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.381 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.405 187212 INFO nova.scheduler.client.report [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Deleted allocations for instance d4d2145a-a261-4c1c-82e7-3f595f46aec6#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.481 187212 DEBUG oslo_concurrency.lockutils [None req-6a415111-ac60-4020-a75b-0476e3ad5d60 7f2d3ae86c634e16a369a01df5a1a50d 806d12fc57454fa3a9768a9e6ec2b812 - - default default] Lock "d4d2145a-a261-4c1c-82e7-3f595f46aec6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.799 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935944.7983358, c0320574-7866-4fe3-a513-f678d16b8726 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.799 187212 INFO nova.compute.manager [-] [instance: c0320574-7866-4fe3-a513-f678d16b8726] VM Stopped (Lifecycle Event)#033[00m
Dec  5 06:59:19 np0005546909 nova_compute[187208]: 2025-12-05 11:59:19.833 187212 DEBUG nova.compute.manager [None req-a89ff6b4-1b8d-4e72-ac3a-c247ff80a93b - - - - - -] [instance: c0320574-7866-4fe3-a513-f678d16b8726] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:20 np0005546909 nova_compute[187208]: 2025-12-05 11:59:20.342 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Dec  5 06:59:20 np0005546909 nova_compute[187208]: 2025-12-05 11:59:20.495 187212 WARNING oslo_policy.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  5 06:59:20 np0005546909 nova_compute[187208]: 2025-12-05 11:59:20.496 187212 WARNING oslo_policy.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  5 06:59:20 np0005546909 nova_compute[187208]: 2025-12-05 11:59:20.498 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 06:59:22 np0005546909 podman[213990]: 2025-12-05 11:59:22.220880743 +0000 UTC m=+0.062350326 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 06:59:22 np0005546909 podman[213991]: 2025-12-05 11:59:22.273512161 +0000 UTC m=+0.104174027 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  5 06:59:24 np0005546909 nova_compute[187208]: 2025-12-05 11:59:24.405 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935949.4035456, 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:24 np0005546909 nova_compute[187208]: 2025-12-05 11:59:24.405 187212 INFO nova.compute.manager [-] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] VM Stopped (Lifecycle Event)#033[00m
Dec  5 06:59:24 np0005546909 nova_compute[187208]: 2025-12-05 11:59:24.438 187212 DEBUG nova.compute.manager [None req-c562e51b-c98e-4abe-ab33-402fae500899 - - - - - -] [instance: 9d6c83f2-2140-44fa-a4be-cd9ba2ee6b70] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:25 np0005546909 nova_compute[187208]: 2025-12-05 11:59:25.607 187212 DEBUG oslo_concurrency.processutils [None req-26862be7-1bd7-40f8-956d-7c5def84be87 7db28c1b5e8344a1931b5e793e16923f 0d0e512e95bb4c6787173c66c924ae20 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:25 np0005546909 nova_compute[187208]: 2025-12-05 11:59:25.649 187212 DEBUG oslo_concurrency.processutils [None req-26862be7-1bd7-40f8-956d-7c5def84be87 7db28c1b5e8344a1931b5e793e16923f 0d0e512e95bb4c6787173c66c924ae20 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:26 np0005546909 nova_compute[187208]: 2025-12-05 11:59:26.014 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Successfully created port: a5ad03eb-1959-4b2d-a437-979506e6b988 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 06:59:27 np0005546909 podman[214040]: 2025-12-05 11:59:27.259371502 +0000 UTC m=+0.092530156 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 06:59:31 np0005546909 nova_compute[187208]: 2025-12-05 11:59:31.113 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Successfully updated port: a5ad03eb-1959-4b2d-a437-979506e6b988 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 06:59:31 np0005546909 nova_compute[187208]: 2025-12-05 11:59:31.138 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:31 np0005546909 nova_compute[187208]: 2025-12-05 11:59:31.139 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:31 np0005546909 nova_compute[187208]: 2025-12-05 11:59:31.139 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:31 np0005546909 nova_compute[187208]: 2025-12-05 11:59:31.533 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:32 np0005546909 nova_compute[187208]: 2025-12-05 11:59:32.855 187212 DEBUG nova.compute.manager [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-changed-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:32 np0005546909 nova_compute[187208]: 2025-12-05 11:59:32.856 187212 DEBUG nova.compute.manager [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Refreshing instance network info cache due to event network-changed-a5ad03eb-1959-4b2d-a437-979506e6b988. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 06:59:32 np0005546909 nova_compute[187208]: 2025-12-05 11:59:32.856 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.107 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.107 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.108 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 06:59:33 np0005546909 podman[214066]: 2025-12-05 11:59:33.211106621 +0000 UTC m=+0.064222418 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.297 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764935958.296861, d4d2145a-a261-4c1c-82e7-3f595f46aec6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.298 187212 INFO nova.compute.manager [-] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] VM Stopped (Lifecycle Event)#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.320 187212 DEBUG nova.compute.manager [None req-db1724b9-21dd-4ed2-83e1-a04e9f2ff516 - - - - - -] [instance: d4d2145a-a261-4c1c-82e7-3f595f46aec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 06:59:33 np0005546909 nova_compute[187208]: 2025-12-05 11:59:33.456 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.654 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.911 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:34.945 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:59:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:34.948 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.959 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.960 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance network_info: |[{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.960 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.961 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Refreshing network info cache for port a5ad03eb-1959-4b2d-a437-979506e6b988 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.964 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start _get_guest_xml network_info=[{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.971 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.987 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.988 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.997 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.998 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:34 np0005546909 nova_compute[187208]: 2025-12-05 11:59:34.999 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.000 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.000 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.001 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.002 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.002 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.003 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.003 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.004 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.004 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.009 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:54Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.010 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.011 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.013 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.037 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <uuid>e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</uuid>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <name>instance-00000003</name>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-445293436-1</nova:name>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:34</nova:creationTime>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        <nova:port uuid="a5ad03eb-1959-4b2d-a437-979506e6b988">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.1.0.55" ipVersion="4"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="fdfe:381f:8400::38b" ipVersion="6"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="serial">e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="uuid">e83b5d7d-04a7-44d9-a6fe-580f1cfa5838</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:2b:76:46"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <target dev="tapa5ad03eb-19"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/console.log" append="off"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:35 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:35 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:35 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:35 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.039 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Preparing to wait for external event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.039 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.040 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.040 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.041 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:54Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.042 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.043 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.043 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.081 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.081 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLOUT] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.083 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.094 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.095 187212 INFO oslo.privsep.daemon [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpq2dvfmb4/privsep.sock']#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.827 187212 INFO oslo.privsep.daemon [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.700 214095 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.705 214095 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.707 214095 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec  5 06:59:35 np0005546909 nova_compute[187208]: 2025-12-05 11:59:35.707 214095 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214095#033[00m
Dec  5 06:59:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:35.951 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.129 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5ad03eb-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.129 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5ad03eb-19, col_values=(('external_ids', {'iface-id': 'a5ad03eb-1959-4b2d-a437-979506e6b988', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:76:46', 'vm-uuid': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.131 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:36 np0005546909 NetworkManager[55691]: <info>  [1764935976.1323] manager: (tapa5ad03eb-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.138 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19')#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.144 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.160 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.160 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.161 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.162 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.162 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.163 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.163 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.188 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.189 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.346 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:2b:76:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.347 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Using config drive#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.377 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.462 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.463 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.520 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.528 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.603 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.604 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.662 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.663 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000003, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config'#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.815 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.816 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5724MB free_disk=73.31016540527344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.817 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.921 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 04518502-62f1-44c3-8c57-b3404958536f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b2e8212c-084c-4a4f-b930-56560ae4da12 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 06:59:36 np0005546909 nova_compute[187208]: 2025-12-05 11:59:36.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.013 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.026 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.050 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.050 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.947 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.948 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.979 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:37 np0005546909 nova_compute[187208]: 2025-12-05 11:59:37.979 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 06:59:38 np0005546909 nova_compute[187208]: 2025-12-05 11:59:38.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 06:59:38 np0005546909 nova_compute[187208]: 2025-12-05 11:59:38.748 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Dec  5 06:59:38 np0005546909 nova_compute[187208]: 2025-12-05 11:59:38.749 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.154 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Creating config drive at /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config#033[00m
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.160 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwg12it9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.285 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxwg12it9" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:39 np0005546909 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec  5 06:59:39 np0005546909 kernel: tapa5ad03eb-19: entered promiscuous mode
Dec  5 06:59:39 np0005546909 NetworkManager[55691]: <info>  [1764935979.3566] manager: (tapa5ad03eb-19): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:39 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:39Z|00033|binding|INFO|Claiming lport a5ad03eb-1959-4b2d-a437-979506e6b988 for this chassis.
Dec  5 06:59:39 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:39Z|00034|binding|INFO|a5ad03eb-1959-4b2d-a437-979506e6b988: Claiming fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b
Dec  5 06:59:39 np0005546909 systemd-udevd[214132]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:59:39 np0005546909 NetworkManager[55691]: <info>  [1764935979.4220] device (tapa5ad03eb-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:59:39 np0005546909 NetworkManager[55691]: <info>  [1764935979.4227] device (tapa5ad03eb-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:39 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:39Z|00035|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 ovn-installed in OVS
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:39 np0005546909 systemd-machined[153543]: New machine qemu-7-instance-00000003.
Dec  5 06:59:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.527 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], port_security=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.55/26 fdfe:381f:8400::38b/64', 'neutron:device_id': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a5ad03eb-1959-4b2d-a437-979506e6b988) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:59:39 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:39Z|00036|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 up in Southbound
Dec  5 06:59:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.528 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a5ad03eb-1959-4b2d-a437-979506e6b988 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 bound to our chassis#033[00m
Dec  5 06:59:39 np0005546909 systemd[1]: Started Virtual Machine qemu-7-instance-00000003.
Dec  5 06:59:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.530 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273#033[00m
Dec  5 06:59:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:39.532 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpppfurjfv/privsep.sock']#033[00m
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.992 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935979.991877, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:39 np0005546909 nova_compute[187208]: 2025-12-05 11:59:39.993 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.011 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.015 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935979.992355, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.015 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Paused (Lifecycle Event)#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.038 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.040 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:40 np0005546909 nova_compute[187208]: 2025-12-05 11:59:40.061 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.208 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.209 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpppfurjfv/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.086 214158 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.090 214158 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.091 214158 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.092 214158 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214158#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.213 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc3d578-a5db-4d98-8404-fee48b08ddd9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.799 214158 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.800 214158 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:40.800 214158 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.225 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updated VIF entry in instance network info cache for port a5ad03eb-1959-4b2d-a437-979506e6b988. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.225 187212 DEBUG nova.network.neutron [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [{"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.290 187212 DEBUG oslo_concurrency.lockutils [req-fe1186f2-9bde-48b7-a05a-3aad18742296 req-d7f6e164-a612-4f80-938d-6c8d0bb562c2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.382 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4655addf-07ee-41bf-925c-a4d94b663704]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.384 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca5a0748-21 in ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.386 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca5a0748-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.386 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3bac0a-4bc8-4f78-898e-37bc1d732cc9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.389 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d20a50b-b439-4ae6-be62-fd0838ec19b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[19e1ab11-b796-4f1d-8273-bf5ad4d08955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[182f53da-6cfa-47d9-8807-5570c8de5a3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.424 104471 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1rim3niy/privsep.sock']#033[00m
Dec  5 06:59:41 np0005546909 podman[214167]: 2025-12-05 11:59:41.509118522 +0000 UTC m=+0.077455821 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.536 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Automatically allocated network: {'id': 'ca5a0748-2268-4f31-a673-9ef2606c4273', 'name': 'auto_allocated_network', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['3601d1a0-6de8-4dc7-839c-f1b2d1901b80', 'c438a52b-4019-49b5-8423-28f69ccbec64'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-12-05T11:58:56Z', 'updated_at': '2025-12-05T11:59:18Z', 'revision_number': 4, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.537 187212 DEBUG nova.policy [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 06:59:41 np0005546909 nova_compute[187208]: 2025-12-05 11:59:41.578 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Successfully created port: 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.132 104471 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.133 104471 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1rim3niy/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.994 214193 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:41.998 214193 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.000 214193 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.000 214193 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214193#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.136 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[668690a6-dc49-42b1-b03a-ff3ee1d69622]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:42 np0005546909 nova_compute[187208]: 2025-12-05 11:59:42.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.620 214193 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.620 214193 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:42.621 214193 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.221 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2345a418-2b50-4221-9625-1a90a97d22c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1273ab9b-a534-4039-9649-16d8fed359fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 NetworkManager[55691]: <info>  [1764935983.2488] manager: (tapca5a0748-20): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.251 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Successfully created port: 06886ab7-aa74-4f44-b509-94e27d585818 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 06:59:43 np0005546909 systemd-udevd[214205]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.284 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9e041ebd-9d8c-4aab-a491-2b3742ad8997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f089ad38-4e49-49e2-8d34-ab49c9781893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 NetworkManager[55691]: <info>  [1764935983.3154] device (tapca5a0748-20): carrier: link connected
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.324 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c306d05-7b09-4b6f-a1ba-ac723a6f2889]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0cbb79b2-6538-41f0-9a11-5c2b910f27c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214223, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f55174c6-bf97-4ab9-bada-49425e51a430]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:49b1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336387, 'tstamp': 336387}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214224, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2fd310d7-e336-44ed-8781-acfde9972292]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214225, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.410 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5dabe1dc-c799-40ca-aefd-b08319a23390]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6daa22-4bf7-463e-bff6-b7b6fe24729a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.474 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.474 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.475 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.476 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:43 np0005546909 NetworkManager[55691]: <info>  [1764935983.4772] manager: (tapca5a0748-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec  5 06:59:43 np0005546909 kernel: tapca5a0748-20: entered promiscuous mode
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.481 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:43 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:43Z|00037|binding|INFO|Releasing lport 4248cb8a-d980-4682-8c47-d6faac0a26bc from this chassis (sb_readonly=0)
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.483 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.484 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d953b3c-df09-4a3b-a118-80d8bfb6d1ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.485 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-ca5a0748-2268-4f31-a673-9ef2606c4273
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/ca5a0748-2268-4f31-a673-9ef2606c4273.pid.haproxy
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID ca5a0748-2268-4f31-a673-9ef2606c4273
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 06:59:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:43.485 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'env', 'PROCESS_TAG=haproxy-ca5a0748-2268-4f31-a673-9ef2606c4273', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca5a0748-2268-4f31-a673-9ef2606c4273.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:43 np0005546909 podman[214258]: 2025-12-05 11:59:43.872884196 +0000 UTC m=+0.057640237 container create 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  5 06:59:43 np0005546909 systemd[1]: Started libpod-conmon-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope.
Dec  5 06:59:43 np0005546909 podman[214258]: 2025-12-05 11:59:43.842819139 +0000 UTC m=+0.027575190 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.938 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Successfully updated port: 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 06:59:43 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:43 np0005546909 nova_compute[187208]: 2025-12-05 11:59:43.954 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:43 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77b012e43fc6df7609492b693cc8452628271339171d87e515263e44dc855891/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 06:59:43 np0005546909 podman[214258]: 2025-12-05 11:59:43.968834555 +0000 UTC m=+0.153590596 container init 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 06:59:43 np0005546909 podman[214258]: 2025-12-05 11:59:43.974191982 +0000 UTC m=+0.158948013 container start 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 06:59:43 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : New worker (214279) forked
Dec  5 06:59:43 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : Loading success.
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.379 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.611 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Successfully updated port: 06886ab7-aa74-4f44-b509-94e27d585818 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.650 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.650 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.651 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.983 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:44 np0005546909 nova_compute[187208]: 2025-12-05 11:59:44.984 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.050 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.117 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.146 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.147 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.163 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.163 187212 INFO nova.compute.claims [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.376 187212 DEBUG nova.compute.provider_tree [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.445 187212 DEBUG nova.scheduler.client.report [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.482 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.483 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.535 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.535 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.555 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.571 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.656 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.658 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.658 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating image(s)#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.659 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.659 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.660 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.672 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.729 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.731 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.732 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.742 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.804 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.805 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.839 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.840 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.840 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.903 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.905 187212 DEBUG nova.virt.disk.api [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Checking if we can resize image /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.906 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.957 187212 DEBUG nova.policy [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.967 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.968 187212 DEBUG nova.virt.disk.api [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Cannot resize image /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:59:45 np0005546909 nova_compute[187208]: 2025-12-05 11:59:45.969 187212 DEBUG nova.objects.instance [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'migration_context' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.077 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.078 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Ensure instance console log exists: /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.079 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.151 187212 DEBUG nova.compute.manager [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-changed-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.152 187212 DEBUG nova.compute.manager [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Refreshing instance network info cache due to event network-changed-0d1b5558-6557-43e9-8cac-a00b4e97ea8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.152 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:46 np0005546909 podman[214303]: 2025-12-05 11:59:46.207626351 +0000 UTC m=+0.058030217 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.345 187212 DEBUG nova.compute.manager [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-changed-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.346 187212 DEBUG nova.compute.manager [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Refreshing instance network info cache due to event network-changed-06886ab7-aa74-4f44-b509-94e27d585818. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 06:59:46 np0005546909 nova_compute[187208]: 2025-12-05 11:59:46.346 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:47 np0005546909 nova_compute[187208]: 2025-12-05 11:59:47.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.463 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Successfully created port: 9275d01b-3eb9-429b-a0ba-0cb60048987a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.466 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.484 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance network_info: |[{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.485 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Refreshing network info cache for port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.488 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start _get_guest_xml network_info=[{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.493 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.498 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.499 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.509 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.509 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.510 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.510 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.511 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.512 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.513 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.517 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.518 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.519 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.520 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <uuid>b2e8212c-084c-4a4f-b930-56560ae4da12</uuid>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <name>instance-00000006</name>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-445293436-3</nova:name>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:48</nova:creationTime>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:port uuid="0d1b5558-6557-43e9-8cac-a00b4e97ea8b">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.1.0.6" ipVersion="4"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="fdfe:381f:8400::100" ipVersion="6"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="serial">b2e8212c-084c-4a4f-b930-56560ae4da12</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="uuid">b2e8212c-084c-4a4f-b930-56560ae4da12</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:05:76:3a"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="tap0d1b5558-65"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/console.log" append="off"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:48 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:48 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Preparing to wait for external event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.539 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.540 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.541 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.541 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.542 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.542 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.543 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.543 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.546 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d1b5558-65, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.546 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d1b5558-65, col_values=(('external_ids', {'iface-id': '0d1b5558-6557-43e9-8cac-a00b4e97ea8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:76:3a', 'vm-uuid': 'b2e8212c-084c-4a4f-b930-56560ae4da12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 NetworkManager[55691]: <info>  [1764935988.5484] manager: (tap0d1b5558-65): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.554 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.555 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65')#033[00m
Dec  5 06:59:48 np0005546909 podman[214323]: 2025-12-05 11:59:48.565329658 +0000 UTC m=+0.065931945 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.616 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:05:76:3a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.617 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Using config drive#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.662 187212 DEBUG nova.network.neutron [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance network_info: |[{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.678 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Refreshing network info cache for port 06886ab7-aa74-4f44-b509-94e27d585818 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.681 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start _get_guest_xml network_info=[{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.684 187212 WARNING nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.689 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.689 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.host [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.692 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.693 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.694 187212 DEBUG nova.virt.hardware [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.697 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.697 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.698 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.698 187212 DEBUG nova.objects.instance [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'pci_devices' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.713 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <uuid>04518502-62f1-44c3-8c57-b3404958536f</uuid>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <name>instance-00000005</name>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-445293436-2</nova:name>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:48</nova:creationTime>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:user uuid="c4c62f22ba09455995ea1bde6a93431e">tempest-AutoAllocateNetworkTest-275048159-project-member</nova:user>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:project uuid="fb2c9c006bee4723bc8dd108e19a6728">tempest-AutoAllocateNetworkTest-275048159</nova:project>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        <nova:port uuid="06886ab7-aa74-4f44-b509-94e27d585818">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.1.0.8" ipVersion="4"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="fdfe:381f:8400::241" ipVersion="6"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="serial">04518502-62f1-44c3-8c57-b3404958536f</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="uuid">04518502-62f1-44c3-8c57-b3404958536f</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:61:58:b9"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <target dev="tap06886ab7-aa"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/console.log" append="off"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:48 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:48 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:48 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:48 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Preparing to wait for external event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.714 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.715 187212 DEBUG nova.virt.libvirt.vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:58:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.715 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.716 187212 DEBUG nova.network.os_vif_util [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.716 187212 DEBUG os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.717 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.720 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06886ab7-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.720 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06886ab7-aa, col_values=(('external_ids', {'iface-id': '06886ab7-aa74-4f44-b509-94e27d585818', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:58:b9', 'vm-uuid': '04518502-62f1-44c3-8c57-b3404958536f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:48 np0005546909 NetworkManager[55691]: <info>  [1764935988.7222] manager: (tap06886ab7-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.728 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.729 187212 INFO os_vif [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa')#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.770 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.771 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.771 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] No VIF found with MAC fa:16:3e:61:58:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 06:59:48 np0005546909 nova_compute[187208]: 2025-12-05 11:59:48.772 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Using config drive#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.131 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Creating config drive at /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.137 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmk4fwg1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.186 187212 INFO nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Creating config drive at /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.191 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6uro0gci execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.269 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmk4fwg1" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.314 187212 DEBUG oslo_concurrency.processutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6uro0gci" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.3332] manager: (tap06886ab7-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec  5 06:59:49 np0005546909 kernel: tap06886ab7-aa: entered promiscuous mode
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00038|binding|INFO|Claiming lport 06886ab7-aa74-4f44-b509-94e27d585818 for this chassis.
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00039|binding|INFO|06886ab7-aa74-4f44-b509-94e27d585818: Claiming fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.340 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.349 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], port_security=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.8/26 fdfe:381f:8400::241/64', 'neutron:device_id': '04518502-62f1-44c3-8c57-b3404958536f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=06886ab7-aa74-4f44-b509-94e27d585818) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.350 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 06886ab7-aa74-4f44-b509-94e27d585818 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 bound to our chassis#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273#033[00m
Dec  5 06:59:49 np0005546909 systemd-udevd[214375]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba8b4c09-27d7-45b9-bbba-8ffcb895c025]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00040|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 ovn-installed in OVS
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00041|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 up in Southbound
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 kernel: tap0d1b5558-65: entered promiscuous mode
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.3863] manager: (tap0d1b5558-65): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00042|binding|INFO|Claiming lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b for this chassis.
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00043|binding|INFO|0d1b5558-6557-43e9-8cac-a00b4e97ea8b: Claiming fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100
Dec  5 06:59:49 np0005546909 systemd-udevd[214385]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.389 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.3945] device (tap06886ab7-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.3962] device (tap06886ab7-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.393 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], port_security=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.6/26 fdfe:381f:8400::100/64', 'neutron:device_id': 'b2e8212c-084c-4a4f-b930-56560ae4da12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d1b5558-6557-43e9-8cac-a00b4e97ea8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00044|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b ovn-installed in OVS
Dec  5 06:59:49 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:49Z|00045|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b up in Southbound
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.4049] device (tap0d1b5558-65): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:59:49 np0005546909 NetworkManager[55691]: <info>  [1764935989.4058] device (tap0d1b5558-65): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.406 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[37cde7ab-37ca-414f-9d51-81fef53dcd98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.410 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[84365e37-eb13-46f3-bcca-5ae955efee08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 systemd-machined[153543]: New machine qemu-8-instance-00000005.
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.436 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b88566b5-2230-4744-8423-cf457fe22554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 systemd[1]: Started Virtual Machine qemu-8-instance-00000005.
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81180780-57b1-4a0c-a5d9-c4ee4e41602a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214393, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.467 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[416c6d41-d5dc-467e-b39b-a3df0682fcd2]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214396, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214396, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 systemd-machined[153543]: New machine qemu-9-instance-00000006.
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.468 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 systemd[1]: Started Virtual Machine qemu-9-instance-00000006.
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.469 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.473 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.475 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[70f242d4-a54d-4cff-98a0-b3dabb4f6e6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.510 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[65e56bc7-5806-430d-b85c-d128e4aebf27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d56ea429-b1aa-435b-bc29-71cf1cd0d7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.536 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c5830703-d0ed-40f8-bb69-ee7a8fc2741e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.554 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c965220-76a4-4ae2-9c42-d1ceebb2ce41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214415, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.574 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72627b1d-ff34-4149-9bbb-cf53b8e54c03]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214416, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214416, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.576 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:49.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.745 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.7450576, 04518502-62f1-44c3-8c57-b3404958536f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.746 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.763 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.769 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.7452853, 04518502-62f1-44c3-8c57-b3404958536f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.769 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Paused (Lifecycle Event)#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.786 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.792 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.840 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.840538, b2e8212c-084c-4a4f-b930-56560ae4da12 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.841 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.862 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.865 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935989.8428938, b2e8212c-084c-4a4f-b930-56560ae4da12 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.865 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Paused (Lifecycle Event)#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.887 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.889 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:49 np0005546909 nova_compute[187208]: 2025-12-05 11:59:49.915 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.217 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updated VIF entry in instance network info cache for port 06886ab7-aa74-4f44-b509-94e27d585818. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.218 187212 DEBUG nova.network.neutron [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [{"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.234 187212 DEBUG oslo_concurrency.lockutils [req-ef8ffcae-cb0d-4512-825c-161aab4b2b91 req-e3afc858-c423-4953-8ccf-e532b44021e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-04518502-62f1-44c3-8c57-b3404958536f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.750 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Successfully updated port: 9275d01b-3eb9-429b-a0ba-0cb60048987a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.767 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.767 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.768 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.930 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.930 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:50 np0005546909 nova_compute[187208]: 2025-12-05 11:59:50.956 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.008 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.042 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.043 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.049 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.049 187212 INFO nova.compute.claims [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.298 187212 DEBUG nova.compute.provider_tree [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.313 187212 DEBUG nova.scheduler.client.report [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.336 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.336 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.394 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.394 187212 DEBUG nova.network.neutron [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.411 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.425 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.507 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updated VIF entry in instance network info cache for port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.508 187212 DEBUG nova.network.neutron [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [{"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.521 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.523 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.524 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating image(s)#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.525 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.525 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.526 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.556 187212 DEBUG oslo_concurrency.lockutils [req-9a5153e7-ea76-43d8-9d6c-8beeb57fc28a req-651e307d-3b10-4f95-9148-97b17d0ff2d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b2e8212c-084c-4a4f-b930-56560ae4da12" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.558 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.623 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.624 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.625 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.654 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.721 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.723 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.768 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.770 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.771 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.845 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.847 187212 DEBUG nova.virt.disk.api [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Checking if we can resize image /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.849 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.871 187212 DEBUG nova.network.neutron [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.872 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.908 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.910 187212 DEBUG nova.virt.disk.api [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Cannot resize image /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.910 187212 DEBUG nova.objects.instance [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'migration_context' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.924 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.925 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Ensure instance console log exists: /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.925 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.926 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.926 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.928 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.931 187212 DEBUG nova.compute.manager [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.932 187212 DEBUG nova.compute.manager [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.932 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.937 187212 WARNING nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.942 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.943 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.945 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.libvirt.host [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.946 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.947 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.947 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.948 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.949 187212 DEBUG nova.virt.hardware [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.953 187212 DEBUG nova.objects.instance [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:51 np0005546909 nova_compute[187208]: 2025-12-05 11:59:51.966 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <uuid>5150eaf5-c0ca-48ab-9045-af5a1c785c8e</uuid>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <name>instance-0000000b</name>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:name>tempest-LiveMigrationNegativeTest-server-865064456</nova:name>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:51</nova:creationTime>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:user uuid="28407300b110465d9748f60fa4ee4945">tempest-LiveMigrationNegativeTest-1771731310-project-member</nova:user>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:        <nova:project uuid="6592a6d983f44d9e94749f0e3e94c689">tempest-LiveMigrationNegativeTest-1771731310</nova:project>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="serial">5150eaf5-c0ca-48ab-9045-af5a1c785c8e</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="uuid">5150eaf5-c0ca-48ab-9045-af5a1c785c8e</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/console.log" append="off"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:51 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:51 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:51 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:51 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.038 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.038 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.039 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Using config drive#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.439 187212 INFO nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Creating config drive at /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.451 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjteqlr9h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.520 187212 DEBUG nova.network.neutron [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.542 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.543 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance network_info: |[{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.543 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.544 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.549 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start _get_guest_xml network_info=[{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.555 187212 WARNING nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.561 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.562 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.571 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.572 187212 DEBUG nova.virt.libvirt.host [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.574 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.575 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.576 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.577 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.577 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.578 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.579 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.579 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.580 187212 DEBUG nova.virt.hardware [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.585 187212 DEBUG nova.virt.libvirt.vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationNegative
TestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:45Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.585 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.586 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.587 187212 DEBUG nova.objects.instance [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.588 187212 DEBUG oslo_concurrency.processutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjteqlr9h" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.603 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] End _get_guest_xml xml=<domain type="kvm">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <uuid>597f2994-fdad-46b1-9ef7-f56d62b4bbd0</uuid>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <name>instance-0000000a</name>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:name>tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876</nova:name>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 11:59:52</nova:creationTime>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:user uuid="d4754b88440a4ea08a37067ef9234672">tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member</nova:user>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:project uuid="16d2f26b00364f84b1702bb7219b8d31">tempest-FloatingIPsAssociationNegativeTestJSON-4920441</nova:project>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        <nova:port uuid="9275d01b-3eb9-429b-a0ba-0cb60048987a">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <system>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="serial">597f2994-fdad-46b1-9ef7-f56d62b4bbd0</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="uuid">597f2994-fdad-46b1-9ef7-f56d62b4bbd0</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </system>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <os>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </os>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <features>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </features>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </clock>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  <devices>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </disk>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:f5:93:9d"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <target dev="tap9275d01b-3e"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </interface>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/console.log" append="off"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </serial>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <video>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </video>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </rng>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 06:59:52 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 06:59:52 np0005546909 nova_compute[187208]:  </devices>
Dec  5 06:59:52 np0005546909 nova_compute[187208]: </domain>
Dec  5 06:59:52 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.603 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Preparing to wait for external event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.604 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.605 187212 DEBUG nova.virt.libvirt.vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociati
onNegativeTestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:45Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.605 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.606 187212 DEBUG nova.network.os_vif_util [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.606 187212 DEBUG os_vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.607 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9275d01b-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9275d01b-3e, col_values=(('external_ids', {'iface-id': '9275d01b-3eb9-429b-a0ba-0cb60048987a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:93:9d', 'vm-uuid': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:52 np0005546909 NetworkManager[55691]: <info>  [1764935992.6125] manager: (tap9275d01b-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.618 187212 INFO os_vif [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e')#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.668 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.669 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.669 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] No VIF found with MAC fa:16:3e:f5:93:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 06:59:52 np0005546909 nova_compute[187208]: 2025-12-05 11:59:52.670 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Using config drive#033[00m
Dec  5 06:59:52 np0005546909 systemd-machined[153543]: New machine qemu-10-instance-0000000b.
Dec  5 06:59:52 np0005546909 systemd[1]: Started Virtual Machine qemu-10-instance-0000000b.
Dec  5 06:59:52 np0005546909 podman[214459]: 2025-12-05 11:59:52.71025557 +0000 UTC m=+0.061526423 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 06:59:52 np0005546909 podman[214460]: 2025-12-05 11:59:52.745939902 +0000 UTC m=+0.095710524 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.112 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935993.1117496, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.113 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.116 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.117 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.120 187212 INFO nova.virt.libvirt.driver [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance spawned successfully.#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.121 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.127 187212 INFO nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Creating config drive at /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.133 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgdy85fq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.151 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.161 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.162 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.162 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.163 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.163 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.164 187212 DEBUG nova.virt.libvirt.driver [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.196 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.197 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935993.1165237, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.197 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.233 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.242 187212 INFO nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 1.72 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.243 187212 DEBUG nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.254 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.256 187212 DEBUG oslo_concurrency.processutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgdy85fq" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.316 187212 INFO nova.compute.manager [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 2.30 seconds to build instance.#033[00m
Dec  5 06:59:53 np0005546909 kernel: tap9275d01b-3e: entered promiscuous mode
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.3235] manager: (tap9275d01b-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Dec  5 06:59:53 np0005546909 systemd-udevd[214523]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.324 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:53Z|00046|binding|INFO|Claiming lport 9275d01b-3eb9-429b-a0ba-0cb60048987a for this chassis.
Dec  5 06:59:53 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:53Z|00047|binding|INFO|9275d01b-3eb9-429b-a0ba-0cb60048987a: Claiming fa:16:3e:f5:93:9d 10.100.0.7
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.330 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.333 187212 DEBUG oslo_concurrency.lockutils [None req-706cc622-8f06-4e6b-ad74-6cc19543c39c 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.340 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:93:9d 10.100.0.7'], port_security=['fa:16:3e:f5:93:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d2f26b00364f84b1702bb7219b8d31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba7f2e39-8114-45e5-bd44-4ae84ab46fc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd38fa62-d49e-4607-8d3e-179b767c8786, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9275d01b-3eb9-429b-a0ba-0cb60048987a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9275d01b-3eb9-429b-a0ba-0cb60048987a in datapath e5a9559e-b860-47a2-b44b-45c7f67f2119 bound to our chassis#033[00m
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.3436] device (tap9275d01b-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.3447] device (tap9275d01b-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.346 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e5a9559e-b860-47a2-b44b-45c7f67f2119#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d55da232-a444-4415-a03d-5233eb83894e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.361 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape5a9559e-b1 in ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 06:59:53 np0005546909 systemd-machined[153543]: New machine qemu-11-instance-0000000a.
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.368 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape5a9559e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a79f120b-9557-4c47-99db-ff705c6abc62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3be518c-1f1e-4547-9570-0ea33e7a0ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 systemd[1]: Started Virtual Machine qemu-11-instance-0000000a.
Dec  5 06:59:53 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:53Z|00048|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a ovn-installed in OVS
Dec  5 06:59:53 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:53Z|00049|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a up in Southbound
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.394 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b705c481-30d1-4d27-8686-49b4aebf26cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.396 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.412 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06573a8f-f3f5-49c9-b5c8-ee65f0dcbd3f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.466 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05c7e9-84ba-4cee-92fd-90c61a63709d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.4754] manager: (tape5a9559e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.473 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[82d80d9f-3d22-4a57-87ce-4c1a9ef5de4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.521 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3824b5f1-f11d-428c-aa1d-5ce179113b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.525 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[37c24756-4840-4ee7-ac1a-07c9e27b7224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.5603] device (tape5a9559e-b0): carrier: link connected
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.567 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff99cd9-0f25-4d2b-b548-323a60d84680]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.593 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c309175f-8ecd-4327-836a-fe5bad8cc34c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5a9559e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:26:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337412, 'reachable_time': 24464, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214576, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.615 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d02d49b3-04a2-4ad9-896b-600e43f29f4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:26b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337412, 'tstamp': 337412}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214577, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[619cceeb-dd3e-4ee4-b54e-0303cacbe918]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5a9559e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:26:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337412, 'reachable_time': 24464, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214578, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.699 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0322fcea-3785-4df9-bd1c-0fd4c8e90fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.777 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2fad411-5aa3-4ddf-b4af-4424d9fd34ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5a9559e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5a9559e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 NetworkManager[55691]: <info>  [1764935993.7813] manager: (tape5a9559e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec  5 06:59:53 np0005546909 kernel: tape5a9559e-b0: entered promiscuous mode
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape5a9559e-b0, col_values=(('external_ids', {'iface-id': '79bf1a96-6e90-41b7-8356-9756185de59f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 06:59:53 np0005546909 ovn_controller[95610]: 2025-12-05T11:59:53Z|00050|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.786 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 nova_compute[187208]: 2025-12-05 11:59:53.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.808 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.816 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81b6f407-2650-4358-9f51-b1765d612ce1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.817 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-e5a9559e-b860-47a2-b44b-45c7f67f2119
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/e5a9559e-b860-47a2-b44b-45c7f67f2119.pid.haproxy
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID e5a9559e-b860-47a2-b44b-45c7f67f2119
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 06:59:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 11:59:53.817 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'env', 'PROCESS_TAG=haproxy-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e5a9559e-b860-47a2-b44b-45c7f67f2119.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.312 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935994.3124888, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.313 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Started (Lifecycle Event)#033[00m
Dec  5 06:59:54 np0005546909 podman[214614]: 2025-12-05 11:59:54.379680027 +0000 UTC m=+0.052311870 container create 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  5 06:59:54 np0005546909 podman[214614]: 2025-12-05 11:59:54.348767637 +0000 UTC m=+0.021399510 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.598 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.609 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935994.312636, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.610 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Paused (Lifecycle Event)#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:54 np0005546909 systemd[1]: Started libpod-conmon-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope.
Dec  5 06:59:54 np0005546909 nova_compute[187208]: 2025-12-05 11:59:54.678 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:54 np0005546909 systemd[1]: Started libcrun container.
Dec  5 06:59:54 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ac9401aab02c277b66f0b8b3e087367793eb4fcc0a66aa2c56cb8b76ba06f3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 06:59:54 np0005546909 podman[214614]: 2025-12-05 11:59:54.736680626 +0000 UTC m=+0.409312469 container init 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 06:59:54 np0005546909 podman[214614]: 2025-12-05 11:59:54.746510286 +0000 UTC m=+0.419142119 container start 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 06:59:54 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : New worker (214637) forked
Dec  5 06:59:54 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : Loading success.
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.270 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.271 187212 DEBUG nova.network.neutron [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.286 187212 DEBUG oslo_concurrency.lockutils [req-cc80271a-b461-44de-9abb-371f5b0c41cf req-87daad50-0bd0-4dcc-b93f-09048e40bae3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.449 187212 DEBUG nova.compute.manager [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.449 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG oslo_concurrency.lockutils [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG nova.compute.manager [req-5f75d2ef-30b0-477a-bb28-a0234de1f004 req-b4b6cd9a-a428-4000-b077-e608188a5b32 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Processing event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.450 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance event wait completed in 15 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935995.4598885, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.475 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.476 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.479 187212 INFO nova.virt.libvirt.driver [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance spawned successfully.#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.480 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.499 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.503 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.504 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.505 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.505 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.543 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.571 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 60.85 seconds to spawn the instance on the hypervisor.
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.572 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.626 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 61.75 seconds to build instance.
Dec  5 06:59:55 np0005546909 nova_compute[187208]: 2025-12-05 11:59:55.642 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 61.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.775 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.898 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.899 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.905 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 06:59:56 np0005546909 nova_compute[187208]: 2025-12-05 11:59:56.906 187212 INFO nova.compute.claims [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Claim successful on node compute-0.ctlplane.example.com
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.153 187212 DEBUG nova.compute.provider_tree [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.172 187212 DEBUG nova.scheduler.client.report [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.200 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.201 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.253 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.253 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.280 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.299 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.388 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.389 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.390 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.391 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.391 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.392 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.407 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.486 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.489 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.491 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.504 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.582 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.584 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.618 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.619 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.619 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.676 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.678 187212 DEBUG nova.virt.disk.api [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.678 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.709 187212 DEBUG nova.policy [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.732 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.733 187212 DEBUG nova.virt.disk.api [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.734 187212 DEBUG nova.objects.instance [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.754 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.755 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.755 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:57 np0005546909 nova_compute[187208]: 2025-12-05 11:59:57.756 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.109 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.110 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] No waiting events found dispatching network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received unexpected event network-vif-plugged-a5ad03eb-1959-4b2d-a437-979506e6b988 for instance with vm_state active and task_state None.
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.111 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.112 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.113 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Processing event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.113 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.114 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received unexpected event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with vm_state building and task_state spawning.
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.115 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.116 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Processing event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.117 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 DEBUG oslo_concurrency.lockutils [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 DEBUG nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.118 187212 WARNING nova.compute.manager [req-bcb716d4-3d47-4cff-8e39-f184d200c4ec req-e0a6deef-3968-472d-9720-ba3944ee2a15 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state building and task_state spawning.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.120 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.120 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.125 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935998.1242163, b2e8212c-084c-4a4f-b930-56560ae4da12 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.126 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.129 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.129 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.134 187212 INFO nova.virt.libvirt.driver [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance spawned successfully.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.134 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.138 187212 INFO nova.virt.libvirt.driver [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance spawned successfully.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.139 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.145 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.155 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.167 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.168 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.168 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.169 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.169 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.170 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.175 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.175 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.176 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.176 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.177 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.177 187212 DEBUG nova.virt.libvirt.driver [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.182 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.183 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764935998.1260316, 04518502-62f1-44c3-8c57-b3404958536f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.183 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Resumed (Lifecycle Event)#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.226 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 06:59:58 np0005546909 podman[214662]: 2025-12-05 11:59:58.243319803 +0000 UTC m=+0.094887741 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.265 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.271 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 62.80 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.272 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.279 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 63.14 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.279 187212 DEBUG nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.363 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 64.37 seconds to build instance.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.371 187212 INFO nova.compute.manager [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 64.37 seconds to build instance.#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.386 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 64.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:58 np0005546909 nova_compute[187208]: 2025-12-05 11:59:58.388 187212 DEBUG oslo_concurrency.lockutils [None req-b3850357-01d9-4e04-83ea-340d25018b45 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 64.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 06:59:59 np0005546909 nova_compute[187208]: 2025-12-05 11:59:59.027 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Successfully created port: 380c99a7-9480-45f8-b2f4-adfcdfa8576d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:00:00 np0005546909 nova_compute[187208]: 2025-12-05 12:00:00.065 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Successfully updated port: 380c99a7-9480-45f8-b2f4-adfcdfa8576d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:00:00 np0005546909 nova_compute[187208]: 2025-12-05 12:00:00.084 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:00 np0005546909 nova_compute[187208]: 2025-12-05 12:00:00.085 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:00 np0005546909 nova_compute[187208]: 2025-12-05 12:00:00.085 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:00 np0005546909 nova_compute[187208]: 2025-12-05 12:00:00.601 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.063 187212 DEBUG nova.compute.manager [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-changed-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.063 187212 DEBUG nova.compute.manager [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Refreshing instance network info cache due to event network-changed-380c99a7-9480-45f8-b2f4-adfcdfa8576d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.064 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.162 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.163 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.185 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.305 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.306 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.316 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.318 187212 INFO nova.compute.claims [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.601 187212 DEBUG nova.compute.provider_tree [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.618 187212 DEBUG nova.scheduler.client.report [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.642 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.643 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.685 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.686 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.708 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.727 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.810 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.812 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.812 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating image(s)#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.813 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.813 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.814 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.830 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.909 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.910 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.911 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.923 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.987 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:01 np0005546909 nova_compute[187208]: 2025-12-05 12:00:01.988 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.009 187212 DEBUG nova.policy [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.033 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.034 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.035 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.090 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.091 187212 DEBUG nova.virt.disk.api [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.093 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.159 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.167 187212 DEBUG nova.virt.disk.api [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.168 187212 DEBUG nova.objects.instance [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.198 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.200 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Ensure instance console log exists: /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.200 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.201 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.201 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.541 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.542 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.570 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.614 187212 DEBUG nova.network.neutron [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.616 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.634 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.635 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance network_info: |[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.635 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.636 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Refreshing network info cache for port 380c99a7-9480-45f8-b2f4-adfcdfa8576d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.641 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.643 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.644 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.652 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.653 187212 INFO nova.compute.claims [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.657 187212 WARNING nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.673 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.676 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.679 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.libvirt.host [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.682 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.683 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.683 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.684 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.685 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.686 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.686 187212 DEBUG nova.virt.hardware [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.690 187212 DEBUG nova.virt.libvirt.vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715
947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:57Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.691 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.692 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.693 187212 DEBUG nova.objects.instance [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.712 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <name>instance-0000000c</name>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:02</nova:creationTime>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:24:4f:38"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <target dev="tap380c99a7-94"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:02 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:02 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:02 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:02 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.718 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.718 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.719 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.719 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.720 187212 DEBUG nova.virt.libvirt.vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTe
stJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T11:59:57Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.720 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.721 187212 DEBUG nova.network.os_vif_util [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.722 187212 DEBUG os_vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.723 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.723 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.724 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.732 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.733 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.734 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:02 np0005546909 NetworkManager[55691]: <info>  [1764936002.7363] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.735 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.738 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.748 187212 INFO os_vif [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.805 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.816 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.817 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.817 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.821 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Successfully created port: f194d74d-a9ec-4838-b35d-8393a2087ec5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.952 187212 DEBUG nova.compute.provider_tree [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.968 187212 DEBUG nova.scheduler.client.report [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.991 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:02 np0005546909 nova_compute[187208]: 2025-12-05 12:00:02.992 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:00:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.006 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:03.007 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.055 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.056 187212 DEBUG nova.network.neutron [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.083 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.105 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.216 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.217 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating image(s)
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.219 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.232 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.291 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.292 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.293 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.304 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.375 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.376 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.459 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk 1073741824" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.461 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.461 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.526 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.527 187212 DEBUG nova.virt.disk.api [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Checking if we can resize image /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.530 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.600 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.602 187212 DEBUG nova.virt.disk.api [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Cannot resize image /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.602 187212 DEBUG nova.objects.instance [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'migration_context' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.626 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Ensure instance console log exists: /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.627 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:03 np0005546909 nova_compute[187208]: 2025-12-05 12:00:03.628 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.223 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Successfully updated port: f194d74d-a9ec-4838-b35d-8393a2087ec5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.226 187212 INFO nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.232 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qu0cyew execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.256 187212 DEBUG nova.network.neutron [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.257 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.261 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.266 187212 WARNING nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.271 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.272 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.host [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.276 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.277 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.278 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.278 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.279 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.280 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.280 187212 DEBUG nova.virt.hardware [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.283 187212 DEBUG nova.objects.instance [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.286 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.286 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.287 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:00:04 np0005546909 podman[214715]: 2025-12-05 12:00:04.303240396 +0000 UTC m=+0.064444613 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.312 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <uuid>8c58d60e-b997-4eed-8cd4-33ac07d9727a</uuid>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <name>instance-0000000e</name>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:name>tempest-LiveMigrationNegativeTest-server-1525054618</nova:name>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:04</nova:creationTime>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:user uuid="28407300b110465d9748f60fa4ee4945">tempest-LiveMigrationNegativeTest-1771731310-project-member</nova:user>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:        <nova:project uuid="6592a6d983f44d9e94749f0e3e94c689">tempest-LiveMigrationNegativeTest-1771731310</nova:project>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="serial">8c58d60e-b997-4eed-8cd4-33ac07d9727a</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="uuid">8c58d60e-b997-4eed-8cd4-33ac07d9727a</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/console.log" append="off"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:04 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:04 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:04 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:04 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.359 187212 DEBUG oslo_concurrency.processutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1qu0cyew" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.373 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.373 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.374 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Using config drive#033[00m
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.4055] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec  5 07:00:04 np0005546909 kernel: tap380c99a7-94: entered promiscuous mode
Dec  5 07:00:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:04Z|00051|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec  5 07:00:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:04Z|00052|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.424 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.425 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.427 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.443 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e028fba-1834-42f7-878f-6804fe7cdf62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.444 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24c61e5e-71 in ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.448 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24c61e5e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f05a197c-ec93-4cca-b7f6-53d8a4a12916]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fbae72d1-5a58-4cff-96a5-13413b115286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 systemd-machined[153543]: New machine qemu-12-instance-0000000c.
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Dec  5 07:00:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:04Z|00053|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec  5 07:00:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:04Z|00054|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.477 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2a518ba3-d129-458e-aeb4-5d58c3666f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 systemd-udevd[214758]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.4898] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.4907] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.498 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[27b07ad7-d2f4-4487-86ec-bd235d832564]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.540 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6f908e-c48d-4c44-8b69-87d70a016fab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[331f8097-091d-49e9-8c3a-4f72ea8dbcf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.5506] manager: (tap24c61e5e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec  5 07:00:04 np0005546909 systemd-udevd[214761]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.590 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f96ae67a-5e47-49f1-a267-636b7c9f700c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.593 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b087cb02-6658-49ca-956c-bc8bc3323fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.614 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.6210] device (tap24c61e5e-70): carrier: link connected
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.637 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3b0ddc-5d4a-4eeb-a1dd-f7a21685cde2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bdda9eb-f2b4-4db6-b61b-2dc5ed5aace9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214789, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.686 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[905b216b-01cc-4e38-98b9-dd0ef67aced1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:ede6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338518, 'tstamp': 338518}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214790, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.701 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e45034cd-7431-4a40-8907-c9f05556a516]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214791, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.731 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[167e386a-3c5e-4e2d-9a93-589280ba971d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.800 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1e556323-eab0-4e1f-afee-7d37489dad07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.802 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:04 np0005546909 kernel: tap24c61e5e-70: entered promiscuous mode
Dec  5 07:00:04 np0005546909 NetworkManager[55691]: <info>  [1764936004.8056] manager: (tap24c61e5e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.805 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.810 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:04Z|00055|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.812 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.816 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9acecc3f-e731-4485-a9f7-1af549f40b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.818 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.pid.haproxy
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 24c61e5e-7d15-4019-b1bd-d2e253f41aa5
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:00:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:04.820 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'env', 'PROCESS_TAG=haproxy-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24c61e5e-7d15-4019-b1bd-d2e253f41aa5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.885 187212 INFO nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Creating config drive at /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.891 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxlk2ui2b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.991 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.992 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.993 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.993 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Processing event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.994 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.995 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 DEBUG oslo_concurrency.lockutils [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 DEBUG nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.996 187212 WARNING nova.compute.manager [req-27b503dc-9a8f-40ff-8ad1-6ae595622882 req-6fe2efd9-2261-42ae-89be-262d15087865 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received unexpected event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:00:04 np0005546909 nova_compute[187208]: 2025-12-05 12:00:04.997 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.006 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.0048325, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.006 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.009 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.013 187212 INFO nova.virt.libvirt.driver [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance spawned successfully.#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.013 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.028 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.033 187212 DEBUG oslo_concurrency.processutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxlk2ui2b" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.054 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.062 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.063 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.063 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.064 187212 DEBUG nova.virt.libvirt.driver [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.071 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:05 np0005546909 systemd-machined[153543]: New machine qemu-13-instance-0000000e.
Dec  5 07:00:05 np0005546909 systemd[1]: Started Virtual Machine qemu-13-instance-0000000e.
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.132 187212 INFO nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 19.48 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.133 187212 DEBUG nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.199 187212 INFO nova.compute.manager [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 20.09 seconds to build instance.#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.217 187212 DEBUG oslo_concurrency.lockutils [None req-5471624e-1a82-4567-8090-cfeae400f30b d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:05 np0005546909 podman[214840]: 2025-12-05 12:00:05.249467618 +0000 UTC m=+0.033085054 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:00:05 np0005546909 podman[214840]: 2025-12-05 12:00:05.38981317 +0000 UTC m=+0.173430586 container create 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.449 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.4494874, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.452 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:05 np0005546909 systemd[1]: Started libpod-conmon-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope.
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.482 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.488 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936005.4495957, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.489 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:00:05 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.513 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:05 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38bc886dbbbea2769a63b04a9e8180064790337d80822dc7c2e5b30fc62aed96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.529 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:05 np0005546909 podman[214840]: 2025-12-05 12:00:05.532337894 +0000 UTC m=+0.315955340 container init 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  5 07:00:05 np0005546909 podman[214840]: 2025-12-05 12:00:05.538127709 +0000 UTC m=+0.321745125 container start 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG nova.compute.manager [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-changed-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG nova.compute.manager [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Refreshing instance network info cache due to event network-changed-f194d74d-a9ec-4838-b35d-8393a2087ec5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.561 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.629 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updated VIF entry in instance network info cache for port 380c99a7-9480-45f8-b2f4-adfcdfa8576d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.630 187212 DEBUG nova.network.neutron [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.645 187212 DEBUG oslo_concurrency.lockutils [req-88e9512c-ea24-418d-80b2-6ddd05032793 req-8ac2973f-a6d9-48ee-a822-896d34ba82ac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:05 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : New worker (214875) forked
Dec  5 07:00:05 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : Loading success.
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.895 187212 DEBUG nova.network.neutron [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.920 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.921 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance network_info: |[{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.921 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.922 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Refreshing network info cache for port f194d74d-a9ec-4838-b35d-8393a2087ec5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.924 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start _get_guest_xml network_info=[{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.927 187212 WARNING nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.932 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.933 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.939 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.libvirt.host [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.940 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.941 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.942 187212 DEBUG nova.virt.hardware [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.946 187212 DEBUG nova.virt.libvirt.vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715
947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.946 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.947 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.948 187212 DEBUG nova.objects.instance [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.984 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <uuid>3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</uuid>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <name>instance-0000000d</name>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-1562123791</nova:name>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:05</nova:creationTime>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        <nova:port uuid="f194d74d-a9ec-4838-b35d-8393a2087ec5">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="serial">3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="uuid">3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:d0:fa:14"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <target dev="tapf194d74d-a9"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/console.log" append="off"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:05 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:05 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:05 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:05 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Preparing to wait for external event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.985 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.986 187212 DEBUG nova.virt.libvirt.vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTe
stJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.986 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG nova.network.os_vif_util [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG os_vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.987 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.988 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.992 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf194d74d-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.993 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf194d74d-a9, col_values=(('external_ids', {'iface-id': 'f194d74d-a9ec-4838-b35d-8393a2087ec5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:fa:14', 'vm-uuid': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:05 np0005546909 NetworkManager[55691]: <info>  [1764936005.9954] manager: (tapf194d74d-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Dec  5 07:00:05 np0005546909 nova_compute[187208]: 2025-12-05 12:00:05.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.004 187212 INFO os_vif [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9')#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.153 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:d0:fa:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.154 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Using config drive#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.554 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936006.5551016, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.555 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.559 187212 INFO nova.virt.libvirt.driver [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance spawned successfully.#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.559 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.602 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.605 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.605 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.606 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.607 187212 DEBUG nova.virt.libvirt.driver [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.612 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.643 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.644 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936006.5555263, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.644 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.672 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.674 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.693 187212 INFO nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 3.48 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.693 187212 DEBUG nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.701 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.763 187212 INFO nova.compute.manager [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 4.14 seconds to build instance.#033[00m
Dec  5 07:00:06 np0005546909 nova_compute[187208]: 2025-12-05 12:00:06.792 187212 DEBUG oslo_concurrency.lockutils [None req-7d6d1830-970f-4c34-8d21-bdb00cd99da6 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.491 187212 INFO nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Creating config drive at /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.496 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxuftx3q6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.657 187212 DEBUG oslo_concurrency.processutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxuftx3q6" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:07 np0005546909 kernel: tapf194d74d-a9: entered promiscuous mode
Dec  5 07:00:07 np0005546909 NetworkManager[55691]: <info>  [1764936007.7124] manager: (tapf194d74d-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.712 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:07Z|00056|binding|INFO|Claiming lport f194d74d-a9ec-4838-b35d-8393a2087ec5 for this chassis.
Dec  5 07:00:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:07Z|00057|binding|INFO|f194d74d-a9ec-4838-b35d-8393a2087ec5: Claiming fa:16:3e:d0:fa:14 10.100.0.14
Dec  5 07:00:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:07Z|00058|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 ovn-installed in OVS
Dec  5 07:00:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:07Z|00059|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 up in Southbound
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.734 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.736 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.734 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:fa:14 10.100.0.14'], port_security=['fa:16:3e:d0:fa:14 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f194d74d-a9ec-4838-b35d-8393a2087ec5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.740 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f194d74d-a9ec-4838-b35d-8393a2087ec5 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.744 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:00:07 np0005546909 systemd-machined[153543]: New machine qemu-14-instance-0000000d.
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.769 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af48df2a-8001-40bc-ad0b-0834be9f8f88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 systemd[1]: Started Virtual Machine qemu-14-instance-0000000d.
Dec  5 07:00:07 np0005546909 systemd-udevd[214933]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:07 np0005546909 NetworkManager[55691]: <info>  [1764936007.8205] device (tapf194d74d-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:00:07 np0005546909 NetworkManager[55691]: <info>  [1764936007.8212] device (tapf194d74d-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.828 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[671469c2-dce3-4090-9b63-c972349804f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.836 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[57f3c232-cc9c-4cd6-86f9-9d0f3dfe84ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.868 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[17044681-77d0-4026-9d2b-54afebc4b21f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.868 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.868 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.889 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.891 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[addf6894-5e10-4e17-8c03-fd5274a67225]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214946, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.913 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[de1b78eb-1e2d-4ca3-94f8-7ed01ff7d616]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214950, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.915 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:07 np0005546909 nova_compute[187208]: 2025-12-05 12:00:07.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:07.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.089 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.089 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.100 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.101 187212 INFO nova.compute.claims [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.299 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updated VIF entry in instance network info cache for port f194d74d-a9ec-4838-b35d-8393a2087ec5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.299 187212 DEBUG nova.network.neutron [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [{"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.317 187212 DEBUG oslo_concurrency.lockutils [req-8198abaf-640e-4133-af84-4ab5f645b8a0 req-b0d33332-4b2f-4907-825e-bf47c846018e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.378 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.3781261, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.378 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.407 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.413 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.3800335, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.414 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.438 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.457 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.471 187212 DEBUG nova.compute.provider_tree [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.493 187212 DEBUG nova.scheduler.client.report [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.519 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.520 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:08Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:76:46 10.1.0.55
Dec  5 07:00:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:08Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:76:46 10.1.0.55
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.722 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.723 187212 DEBUG nova.network.neutron [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.835 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.835 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.836 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.837 187212 DEBUG oslo_concurrency.lockutils [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.838 187212 DEBUG nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.838 187212 WARNING nova.compute.manager [req-6be63049-a278-4229-9913-791f2341a0c3 req-7e88177c-3a13-477d-a917-db0508b95314 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.839 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.844 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.846 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936008.8457544, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.846 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.866 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.#033[00m
Dec  5 07:00:08 np0005546909 nova_compute[187208]: 2025-12-05 12:00:08.866 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.197 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.204 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.205 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.205 187212 DEBUG nova.virt.libvirt.driver [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.208 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.210 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.233 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.257 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.309 187212 INFO nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 11.92 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.309 187212 DEBUG nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.369 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.370 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.371 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating image(s)#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.372 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.394 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.422 187212 INFO nova.compute.manager [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 12.55 seconds to build instance.#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.443 187212 DEBUG oslo_concurrency.lockutils [None req-6c4126cd-7cf8-4108-8aab-d68d48cf6232 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.512 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.513 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.513 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.524 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.547 187212 DEBUG nova.network.neutron [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.547 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.601 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.601 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.653 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.655 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.655 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.733 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.734 187212 DEBUG nova.virt.disk.api [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Checking if we can resize image /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.735 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.829 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.830 187212 DEBUG nova.virt.disk.api [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Cannot resize image /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.831 187212 DEBUG nova.objects.instance [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'migration_context' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.845 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Ensure instance console log exists: /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.846 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.847 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.848 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.853 187212 WARNING nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.863 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.864 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.869 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.870 187212 DEBUG nova.virt.libvirt.host [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.871 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.871 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.872 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.873 187212 DEBUG nova.virt.hardware [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.878 187212 DEBUG nova.objects.instance [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'pci_devices' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.897 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <uuid>48f123c5-f925-4f6f-94e5-d109e25ef206</uuid>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <name>instance-0000000f</name>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiagnosticsTest-server-464968639</nova:name>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:09</nova:creationTime>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:user uuid="b90e703b69ae4296bdb7708c3a32bb96">tempest-ServerDiagnosticsTest-765338568-project-member</nova:user>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:        <nova:project uuid="be4874dd3f38484aa6f1bf8ba69c451f">tempest-ServerDiagnosticsTest-765338568</nova:project>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="serial">48f123c5-f925-4f6f-94e5-d109e25ef206</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="uuid">48f123c5-f925-4f6f-94e5-d109e25ef206</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/console.log" append="off"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:09 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:09 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:09 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:09 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.941 187212 DEBUG nova.objects.instance [None req-a771051f-9d8e-46d7-8a79-60d4e88701d7 f7c1f6297b534089b496cf7a88d8731e 8dd78283a39d4967be13c14c9c55054a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.965 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.965 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.966 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Using config drive
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.981 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936009.9808989, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.981 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Paused (Lifecycle Event)
Dec  5 07:00:09 np0005546909 nova_compute[187208]: 2025-12-05 12:00:09.997 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.003 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.020 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] During sync_power_state the instance has a pending task (suspending). Skip.
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.395 187212 INFO nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Creating config drive at /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.401 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44jfpx8h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.532 187212 DEBUG oslo_concurrency.processutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp44jfpx8h" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:10 np0005546909 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec  5 07:00:10 np0005546909 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000e.scope: Consumed 4.660s CPU time.
Dec  5 07:00:10 np0005546909 systemd-machined[153543]: Machine qemu-13-instance-0000000e terminated.
Dec  5 07:00:10 np0005546909 systemd-machined[153543]: New machine qemu-15-instance-0000000f.
Dec  5 07:00:10 np0005546909 systemd[1]: Started Virtual Machine qemu-15-instance-0000000f.
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.760 187212 DEBUG nova.compute.manager [None req-a771051f-9d8e-46d7-8a79-60d4e88701d7 f7c1f6297b534089b496cf7a88d8731e 8dd78283a39d4967be13c14c9c55054a - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.983 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936010.9827046, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.984 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.986 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.986 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.990 187212 INFO nova.virt.libvirt.driver [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance spawned successfully.#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.990 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:10 np0005546909 nova_compute[187208]: 2025-12-05 12:00:10.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.034 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.034 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.035 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.036 187212 DEBUG nova.virt.libvirt.driver [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.039 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.043 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936010.9844227, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.086 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.133 187212 INFO nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 1.76 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.133 187212 DEBUG nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.144 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.147 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.171 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.199 187212 INFO nova.compute.manager [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 3.15 seconds to build instance.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.216 187212 DEBUG oslo_concurrency.lockutils [None req-774033ab-2a6b-4760-9d50-47b5fe5cee6c b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.686 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Processing event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.687 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG oslo_concurrency.lockutils [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 DEBUG nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.688 187212 WARNING nova.compute.manager [req-9e10c0e4-c0c6-4348-912d-232909b782f5 req-a1d6f2cd-fcce-4f71-a091-2794fe5b1d43 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.689 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.698 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936011.6981196, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.698 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.701 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.705 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance spawned successfully.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.706 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.723 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.729 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.732 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.733 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.733 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.734 187212 DEBUG nova.virt.libvirt.driver [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.781 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.831 187212 INFO nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 10.02 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.832 187212 DEBUG nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.901 187212 INFO nova.compute.manager [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 10.65 seconds to build instance.#033[00m
Dec  5 07:00:11 np0005546909 nova_compute[187208]: 2025-12-05 12:00:11.923 187212 DEBUG oslo_concurrency.lockutils [None req-b7e3340f-089f-46ba-b12d-490b5c786121 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.236 187212 DEBUG nova.compute.manager [None req-b3dadc14-99f6-48a1-b77e-867c9113c725 0a7c1fec28ba47a491ffab0046222160 4b63a617d21d4836b40a81129fab3990 - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.239 187212 INFO nova.compute.manager [None req-b3dadc14-99f6-48a1-b77e-867c9113c725 0a7c1fec28ba47a491ffab0046222160 4b63a617d21d4836b40a81129fab3990 - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Retrieving diagnostics#033[00m
Dec  5 07:00:12 np0005546909 podman[215054]: 2025-12-05 12:00:12.249633016 +0000 UTC m=+0.100046854 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:58:b9 10.1.0.8
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:58:b9 10.1.0.8
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.618 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.619 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.621 187212 INFO nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Terminating instance#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.621 187212 DEBUG nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:12 np0005546909 kernel: tapa5ad03eb-19 (unregistering): left promiscuous mode
Dec  5 07:00:12 np0005546909 NetworkManager[55691]: <info>  [1764936012.6497] device (tapa5ad03eb-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00060|binding|INFO|Releasing lport a5ad03eb-1959-4b2d-a437-979506e6b988 from this chassis (sb_readonly=0)
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00061|binding|INFO|Setting lport a5ad03eb-1959-4b2d-a437-979506e6b988 down in Southbound
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00062|binding|INFO|Removing iface tapa5ad03eb-19 ovn-installed in OVS
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.668 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], port_security=['fa:16:3e:2b:76:46 10.1.0.55 fdfe:381f:8400::38b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.55/26 fdfe:381f:8400::38b/64', 'neutron:device_id': 'e83b5d7d-04a7-44d9-a6fe-580f1cfa5838', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a5ad03eb-1959-4b2d-a437-979506e6b988) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.670 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a5ad03eb-1959-4b2d-a437-979506e6b988 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.672 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eca578b3-1caf-4b90-9e12-c6bbe4ba22cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec  5 07:00:12 np0005546909 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000003.scope: Consumed 14.231s CPU time.
Dec  5 07:00:12 np0005546909 systemd-machined[153543]: Machine qemu-7-instance-00000003 terminated.
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.733 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[202e0d91-9d29-44d1-b86e-1784ff4561a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.741 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35362dd8-27cd-4e49-83af-ddbf06841efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:76:3a 10.1.0.6
Dec  5 07:00:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:12Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:76:3a 10.1.0.6
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.768 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92f20bcb-b19a-4952-bb54-bac456da9091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa08e6d0-895a-4af4-99c9-718bc1c56253]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1580, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 18, 'tx_packets': 9, 'rx_bytes': 1580, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 18, 'inoctets': 1328, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 18, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1328, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 18, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215083, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.801 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db2a05b2-f251-47bf-8735-e9809a7b047b]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215085, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215085, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.810 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.811 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.811 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:12.812 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.889 187212 INFO nova.virt.libvirt.driver [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Instance destroyed successfully.#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.889 187212 DEBUG nova.objects.instance [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.901 187212 DEBUG nova.virt.libvirt.vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-1',id=3,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T11:59:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:55Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=e83b5d7d-04a7-44d9-a6fe-580f1cfa5838,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.901 187212 DEBUG nova.network.os_vif_util [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "a5ad03eb-1959-4b2d-a437-979506e6b988", "address": "fa:16:3e:2b:76:46", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::38b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5ad03eb-19", "ovs_interfaceid": "a5ad03eb-1959-4b2d-a437-979506e6b988", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.902 187212 DEBUG nova.network.os_vif_util [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.903 187212 DEBUG os_vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.906 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5ad03eb-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.912 187212 INFO os_vif [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:76:46,bridge_name='br-int',has_traffic_filtering=True,id=a5ad03eb-1959-4b2d-a437-979506e6b988,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5ad03eb-19')#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.912 187212 INFO nova.virt.libvirt.driver [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deleting instance files /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838_del#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.913 187212 INFO nova.virt.libvirt.driver [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deletion of /var/lib/nova/instances/e83b5d7d-04a7-44d9-a6fe-580f1cfa5838_del complete#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.966 187212 INFO nova.compute.manager [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.966 187212 DEBUG oslo.service.loopingcall [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.967 187212 DEBUG nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:12 np0005546909 nova_compute[187208]: 2025-12-05 12:00:12.967 187212 DEBUG nova.network.neutron [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.114 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.115 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.116 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.117 187212 INFO nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Terminating instance#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquired lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.118 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.485 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.776 187212 DEBUG nova.network.neutron [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.793 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Releasing lock "refresh_cache-48f123c5-f925-4f6f-94e5-d109e25ef206" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.794 187212 DEBUG nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:13 np0005546909 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec  5 07:00:13 np0005546909 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000000f.scope: Consumed 3.148s CPU time.
Dec  5 07:00:13 np0005546909 systemd-machined[153543]: Machine qemu-15-instance-0000000f terminated.
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.938 187212 DEBUG nova.network.neutron [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:13 np0005546909 nova_compute[187208]: 2025-12-05 12:00:13.958 187212 INFO nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Took 0.99 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.011 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.015 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.045 187212 INFO nova.virt.libvirt.driver [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance destroyed successfully.#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.046 187212 DEBUG nova.objects.instance [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lazy-loading 'resources' on Instance uuid 48f123c5-f925-4f6f-94e5-d109e25ef206 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.066 187212 INFO nova.virt.libvirt.driver [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deleting instance files /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206_del#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.067 187212 INFO nova.virt.libvirt.driver [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deletion of /var/lib/nova/instances/48f123c5-f925-4f6f-94e5-d109e25ef206_del complete#033[00m
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1244] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/37)
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1254] device (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1264] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/38)
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1267] device (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1274] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1280] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1284] device (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  5 07:00:14 np0005546909 NetworkManager[55691]: <info>  [1764936014.1286] device (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.139 187212 INFO nova.compute.manager [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 0.34 seconds to destroy the instance on the hypervisor.
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.139 187212 DEBUG oslo.service.loopingcall [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.140 187212 DEBUG nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.140 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:14Z|00063|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:14Z|00064|binding|INFO|Releasing lport 4248cb8a-d980-4682-8c47-d6faac0a26bc from this chassis (sb_readonly=0)
Dec  5 07:00:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:14Z|00065|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.201 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.292 187212 DEBUG nova.compute.provider_tree [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.308 187212 DEBUG nova.scheduler.client.report [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.343 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.372 187212 INFO nova.scheduler.client.report [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance e83b5d7d-04a7-44d9-a6fe-580f1cfa5838
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.391 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.423 187212 DEBUG nova.network.neutron [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.451 187212 INFO nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Took 0.31 seconds to deallocate network for instance.
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.458 187212 DEBUG oslo_concurrency.lockutils [None req-4bcd372d-dc14-4a6e-9616-50416cd715b2 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "e83b5d7d-04a7-44d9-a6fe-580f1cfa5838" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.495 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.496 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.701 187212 DEBUG nova.compute.provider_tree [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.718 187212 DEBUG nova.scheduler.client.report [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.750 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.780 187212 INFO nova.scheduler.client.report [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Deleted allocations for instance 48f123c5-f925-4f6f-94e5-d109e25ef206
Dec  5 07:00:14 np0005546909 nova_compute[187208]: 2025-12-05 12:00:14.857 187212 DEBUG oslo_concurrency.lockutils [None req-4dbdacf5-931f-4933-8b7c-e232490ec44f b90e703b69ae4296bdb7708c3a32bb96 be4874dd3f38484aa6f1bf8ba69c451f - - default default] Lock "48f123c5-f925-4f6f-94e5-d109e25ef206" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:15 np0005546909 nova_compute[187208]: 2025-12-05 12:00:15.890 187212 DEBUG nova.compute.manager [req-4d496ded-9450-40f6-9b12-d759fb051ee8 req-0deaac7f-360a-4128-9959-7bc942d1ed29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Received event network-vif-deleted-a5ad03eb-1959-4b2d-a437-979506e6b988 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:16 np0005546909 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG nova.compute.manager [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:16 np0005546909 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG nova.compute.manager [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:00:16 np0005546909 nova_compute[187208]: 2025-12-05 12:00:16.708 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:16 np0005546909 nova_compute[187208]: 2025-12-05 12:00:16.709 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:16 np0005546909 nova_compute[187208]: 2025-12-05 12:00:16.709 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.214 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.214 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.215 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.216 187212 INFO nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Terminating instance
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.217 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.218 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquired lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.218 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:00:17 np0005546909 podman[215112]: 2025-12-05 12:00:17.219107922 +0000 UTC m=+0.061901286 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.227 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.365 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.769 187212 DEBUG nova.network.neutron [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.793 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Releasing lock "refresh_cache-8c58d60e-b997-4eed-8cd4-33ac07d9727a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.794 187212 DEBUG nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.801 187212 INFO nova.virt.libvirt.driver [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance destroyed successfully.
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.801 187212 DEBUG nova.objects.instance [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'resources' on Instance uuid 8c58d60e-b997-4eed-8cd4-33ac07d9727a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.814 187212 INFO nova.virt.libvirt.driver [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deleting instance files /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a_del
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.815 187212 INFO nova.virt.libvirt.driver [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deletion of /var/lib/nova/instances/8c58d60e-b997-4eed-8cd4-33ac07d9727a_del complete
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.930 187212 INFO nova.compute.manager [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 0.14 seconds to destroy the instance on the hypervisor.
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.931 187212 DEBUG oslo.service.loopingcall [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.931 187212 DEBUG nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec  5 07:00:17 np0005546909 nova_compute[187208]: 2025-12-05 12:00:17.932 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.105 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.359 187212 DEBUG nova.network.neutron [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.372 187212 INFO nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Took 0.44 seconds to deallocate network for instance.
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.417 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.418 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.610 187212 DEBUG nova.compute.provider_tree [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.684 187212 DEBUG nova.scheduler.client.report [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.711 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.711 187212 DEBUG nova.network.neutron [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.218", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.752 187212 DEBUG oslo_concurrency.lockutils [req-2e546691-901a-43fd-8e94-14214900bd40 req-966ba32a-e76b-4b6e-984b-2665f93f5e67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.755 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.759 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.759 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.760 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.762 187212 INFO nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Terminating instance
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.763 187212 DEBUG nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:00:18 np0005546909 kernel: tap06886ab7-aa (unregistering): left promiscuous mode
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.790 187212 INFO nova.scheduler.client.report [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Deleted allocations for instance 8c58d60e-b997-4eed-8cd4-33ac07d9727a#033[00m
Dec  5 07:00:18 np0005546909 NetworkManager[55691]: <info>  [1764936018.8086] device (tap06886ab7-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:00:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:18Z|00066|binding|INFO|Releasing lport 06886ab7-aa74-4f44-b509-94e27d585818 from this chassis (sb_readonly=0)
Dec  5 07:00:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:18Z|00067|binding|INFO|Setting lport 06886ab7-aa74-4f44-b509-94e27d585818 down in Southbound
Dec  5 07:00:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:18Z|00068|binding|INFO|Removing iface tap06886ab7-aa ovn-installed in OVS
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.815 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.827 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], port_security=['fa:16:3e:61:58:b9 10.1.0.8 fdfe:381f:8400::241'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.8/26 fdfe:381f:8400::241/64', 'neutron:device_id': '04518502-62f1-44c3-8c57-b3404958536f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=06886ab7-aa74-4f44-b509-94e27d585818) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.828 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 06886ab7-aa74-4f44-b509-94e27d585818 in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.830 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca5a0748-2268-4f31-a673-9ef2606c4273#033[00m
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.855 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd5a96d-4f2d-4e7f-bc5f-1bb3b89b1ea2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.866 187212 DEBUG oslo_concurrency.lockutils [None req-94f21bad-e848-418e-a28c-77984e997552 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "8c58d60e-b997-4eed-8cd4-33ac07d9727a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:18 np0005546909 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec  5 07:00:18 np0005546909 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000005.scope: Consumed 13.034s CPU time.
Dec  5 07:00:18 np0005546909 systemd-machined[153543]: Machine qemu-8-instance-00000005 terminated.
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.896 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[346b9479-b062-49a9-933d-2c9d85c1ecf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.901 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8a8485-ac1e-44a1-80cb-71d6b95b3f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 podman[215152]: 2025-12-05 12:00:18.931000644 +0000 UTC m=+0.089342869 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public)
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.936 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce6a7aa-5d58-4d0d-85d8-4cc3df3aace9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.953 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[718e748f-3adf-4b8e-9af7-3656beb3f0a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca5a0748-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:49:b1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 2304, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 11, 'rx_bytes': 2304, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336387, 'reachable_time': 27307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 26, 'inoctets': 1856, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 26, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1856, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 26, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215181, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.974 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e22423eb-6896-4b83-8b46-299274ef57bb]: (4, ({'family': 2, 'prefixlen': 26, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.1.0.2'], ['IFA_LOCAL', '10.1.0.2'], ['IFA_BROADCAST', '10.1.0.63'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336400, 'tstamp': 336400}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215182, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapca5a0748-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336403, 'tstamp': 336403}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215182, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.976 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.977 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:18 np0005546909 nova_compute[187208]: 2025-12-05 12:00:18.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca5a0748-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca5a0748-20, col_values=(('external_ids', {'iface-id': '4248cb8a-d980-4682-8c47-d6faac0a26bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:18.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.041 187212 INFO nova.virt.libvirt.driver [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Instance destroyed successfully.#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.041 187212 DEBUG nova.objects.instance [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid 04518502-62f1-44c3-8c57-b3404958536f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.092 187212 DEBUG nova.virt.libvirt.vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-2',id=5,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T11:59:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:58Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=04518502-62f1-44c3-8c57-b3404958536f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.093 187212 DEBUG nova.network.os_vif_util [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "06886ab7-aa74-4f44-b509-94e27d585818", "address": "fa:16:3e:61:58:b9", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::241", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06886ab7-aa", "ovs_interfaceid": "06886ab7-aa74-4f44-b509-94e27d585818", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.094 187212 DEBUG nova.network.os_vif_util [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.094 187212 DEBUG os_vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.096 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06886ab7-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.102 187212 INFO os_vif [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:58:b9,bridge_name='br-int',has_traffic_filtering=True,id=06886ab7-aa74-4f44-b509-94e27d585818,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06886ab7-aa')#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.103 187212 INFO nova.virt.libvirt.driver [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deleting instance files /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f_del#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.103 187212 INFO nova.virt.libvirt.driver [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deletion of /var/lib/nova/instances/04518502-62f1-44c3-8c57-b3404958536f_del complete#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.169 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.170 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.189 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.195 187212 INFO nova.compute.manager [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG oslo.service.loopingcall [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.196 187212 DEBUG nova.network.neutron [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.294 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.294 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.302 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.302 187212 INFO nova.compute.claims [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.540 187212 DEBUG nova.compute.provider_tree [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.556 187212 DEBUG nova.scheduler.client.report [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.578 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.579 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:00:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:19Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:93:9d 10.100.0.7
Dec  5 07:00:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:19Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:93:9d 10.100.0.7
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.621 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.622 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.643 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.660 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.746 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.748 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.749 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating image(s)
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.751 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.753 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.754 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.774 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.844 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.845 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.846 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.860 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.883 187212 DEBUG nova.policy [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.921 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.922 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.956 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.957 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:19 np0005546909 nova_compute[187208]: 2025-12-05 12:00:19.958 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.019 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.020 187212 DEBUG nova.virt.disk.api [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.021 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.078 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.079 187212 DEBUG nova.virt.disk.api [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.079 187212 DEBUG nova.objects.instance [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.093 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.094 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Ensure instance console log exists: /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.095 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.097 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.097 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.518 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Successfully created port: 75a214ef-2b9f-4c81-bdad-de5791244b85 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.546 187212 DEBUG nova.network.neutron [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.567 187212 INFO nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Took 1.37 seconds to deallocate network for instance.
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.582 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG oslo_concurrency.lockutils [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.583 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.584 187212 DEBUG nova.compute.manager [req-0a740a59-8936-403c-9368-cd6dafee4ddc req-71491d06-ebe7-49f8-bf01-e9d582ac2c03 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-unplugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.620 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.621 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.779 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.780 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.782 187212 INFO nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Terminating instance
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquired lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.783 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.819 187212 DEBUG nova.compute.provider_tree [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.836 187212 DEBUG nova.scheduler.client.report [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:20 np0005546909 nova_compute[187208]: 2025-12-05 12:00:20.857 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.052 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.123 187212 INFO nova.scheduler.client.report [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance 04518502-62f1-44c3-8c57-b3404958536f
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.381 187212 DEBUG oslo_concurrency.lockutils [None req-e1106271-5f67-43ba-8f0f-8ee908692055 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.622s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.632 187212 DEBUG nova.network.neutron [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.652 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Releasing lock "refresh_cache-5150eaf5-c0ca-48ab-9045-af5a1c785c8e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.653 187212 DEBUG nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:00:21 np0005546909 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec  5 07:00:21 np0005546909 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Consumed 14.535s CPU time.
Dec  5 07:00:21 np0005546909 systemd-machined[153543]: Machine qemu-10-instance-0000000b terminated.
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.893 187212 INFO nova.virt.libvirt.driver [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance destroyed successfully.
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.894 187212 DEBUG nova.objects.instance [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lazy-loading 'resources' on Instance uuid 5150eaf5-c0ca-48ab-9045-af5a1c785c8e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:21 np0005546909 nova_compute[187208]: 2025-12-05 12:00:21.990 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Successfully updated port: 75a214ef-2b9f-4c81-bdad-de5791244b85 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.019 187212 INFO nova.virt.libvirt.driver [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deleting instance files /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e_del
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.020 187212 INFO nova.virt.libvirt.driver [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deletion of /var/lib/nova/instances/5150eaf5-c0ca-48ab-9045-af5a1c785c8e_del complete
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.035 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.036 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.036 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.163 187212 INFO nova.compute.manager [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 0.51 seconds to destroy the instance on the hypervisor.
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG oslo.service.loopingcall [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.165 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.248 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.380 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.420 187212 DEBUG nova.network.neutron [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.452 187212 INFO nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Took 0.29 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:22Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:00:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:22Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.588 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.588 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.731 187212 DEBUG nova.compute.provider_tree [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.747 187212 DEBUG nova.scheduler.client.report [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.768 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.791 187212 INFO nova.scheduler.client.report [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Deleted allocations for instance 5150eaf5-c0ca-48ab-9045-af5a1c785c8e#033[00m
Dec  5 07:00:22 np0005546909 nova_compute[187208]: 2025-12-05 12:00:22.881 187212 DEBUG oslo_concurrency.lockutils [None req-84a24b55-4095-44dc-aedd-7f3d3b18ffee 28407300b110465d9748f60fa4ee4945 6592a6d983f44d9e94749f0e3e94c689 - - default default] Lock "5150eaf5-c0ca-48ab-9045-af5a1c785c8e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:23 np0005546909 podman[215231]: 2025-12-05 12:00:23.205797862 +0000 UTC m=+0.053328711 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:00:23 np0005546909 podman[215232]: 2025-12-05 12:00:23.23795842 +0000 UTC m=+0.083464591 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.251 187212 DEBUG nova.network.neutron [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.269 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.269 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance network_info: |[{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.271 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start _get_guest_xml network_info=[{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.275 187212 WARNING nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.279 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.280 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.284 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.284 187212 DEBUG nova.virt.libvirt.host [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.285 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.286 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.287 187212 DEBUG nova.virt.hardware [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.291 187212 DEBUG nova.virt.libvirt.vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947
304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:19Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.291 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.292 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.293 187212 DEBUG nova.objects.instance [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.compute.manager [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-changed-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.compute.manager [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Refreshing instance network info cache due to event network-changed-75a214ef-2b9f-4c81-bdad-de5791244b85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.300 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Refreshing network info cache for port 75a214ef-2b9f-4c81-bdad-de5791244b85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.314 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <uuid>4e7aec76-673e-48b5-b183-cc9c7a95fd37</uuid>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <name>instance-00000010</name>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-720093205</nova:name>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:23</nova:creationTime>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        <nova:port uuid="75a214ef-2b9f-4c81-bdad-de5791244b85">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="serial">4e7aec76-673e-48b5-b183-cc9c7a95fd37</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="uuid">4e7aec76-673e-48b5-b183-cc9c7a95fd37</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:d9:46:fb"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <target dev="tap75a214ef-2b"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/console.log" append="off"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:23 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:23 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:23 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:23 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Preparing to wait for external event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.315 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.316 187212 DEBUG nova.virt.libvirt.vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:19Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.316 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG nova.network.os_vif_util [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG os_vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.317 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.318 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75a214ef-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.322 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75a214ef-2b, col_values=(('external_ids', {'iface-id': '75a214ef-2b9f-4c81-bdad-de5791244b85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:46:fb', 'vm-uuid': '4e7aec76-673e-48b5-b183-cc9c7a95fd37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.323 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:23 np0005546909 NetworkManager[55691]: <info>  [1764936023.3248] manager: (tap75a214ef-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.329 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.329 187212 INFO os_vif [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b')#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.406 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:d9:46:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.407 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Using config drive#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.625 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.625 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.648 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.717 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.718 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.727 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.727 187212 INFO nova.compute.claims [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.883 187212 INFO nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Creating config drive at /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.889 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mc2egvw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.943 187212 DEBUG nova.compute.provider_tree [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.964 187212 DEBUG nova.scheduler.client.report [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.992 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:23 np0005546909 nova_compute[187208]: 2025-12-05 12:00:23.993 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.015 187212 DEBUG oslo_concurrency.processutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3mc2egvw" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.056 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.057 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:00:24 np0005546909 kernel: tap75a214ef-2b: entered promiscuous mode
Dec  5 07:00:24 np0005546909 NetworkManager[55691]: <info>  [1764936024.0741] manager: (tap75a214ef-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.137 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00069|binding|INFO|Claiming lport 75a214ef-2b9f-4c81-bdad-de5791244b85 for this chassis.
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00070|binding|INFO|75a214ef-2b9f-4c81-bdad-de5791244b85: Claiming fa:16:3e:d9:46:fb 10.100.0.5
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.140 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.144 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:46:fb 10.100.0.5'], port_security=['fa:16:3e:d9:46:fb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=75a214ef-2b9f-4c81-bdad-de5791244b85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.145 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 75a214ef-2b9f-4c81-bdad-de5791244b85 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00071|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 ovn-installed in OVS
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00072|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 up in Southbound
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.162 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6058b3e2-1e8c-4fff-a777-5cc57e951db8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 systemd-machined[153543]: New machine qemu-16-instance-00000010.
Dec  5 07:00:24 np0005546909 systemd-udevd[215320]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:24 np0005546909 systemd[1]: Started Virtual Machine qemu-16-instance-00000010.
Dec  5 07:00:24 np0005546909 NetworkManager[55691]: <info>  [1764936024.1891] device (tap75a214ef-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:00:24 np0005546909 NetworkManager[55691]: <info>  [1764936024.1909] device (tap75a214ef-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.195 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[db02f6d0-04e1-4db1-a3a0-e09187c6c966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.198 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a89eadf8-a4c1-4f1b-a1f6-ca77481e74a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.227 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4056c9-c4aa-4e06-82df-7819d195deed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b0565f3d-2206-4af3-a770-190d8180a4bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215332, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.263 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a3120b14-99a1-485b-a02e-16aa927c68fa]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215334, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215334, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.265 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.266 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.268 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.268 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating image(s)#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.269 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.269 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.269 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.270 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.270 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.270 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.286 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.314 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.314 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "04518502-62f1-44c3-8c57-b3404958536f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG oslo_concurrency.lockutils [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "04518502-62f1-44c3-8c57-b3404958536f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] No waiting events found dispatching network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 WARNING nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received unexpected event network-vif-plugged-06886ab7-aa74-4f44-b509-94e27d585818 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.315 187212 DEBUG nova.compute.manager [req-5c872063-4475-4221-aecc-d2b2c2ff5e91 req-75df4951-1021-4e03-ae51-9fe57718ffe2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Received event network-vif-deleted-06886ab7-aa74-4f44-b509-94e27d585818 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.330 187212 DEBUG nova.policy [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.343 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.343 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.344 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.356 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.420 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.422 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.486 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936024.4862533, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.487 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.508 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.512 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936024.4910607, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.513 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.539 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.541 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.576 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.731 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.731 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.732 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.734 187212 INFO nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Terminating instance#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.735 187212 DEBUG nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:24 np0005546909 kernel: tap0d1b5558-65 (unregistering): left promiscuous mode
Dec  5 07:00:24 np0005546909 NetworkManager[55691]: <info>  [1764936024.7563] device (tap0d1b5558-65): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00073|binding|INFO|Releasing lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b from this chassis (sb_readonly=0)
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00074|binding|INFO|Setting lport 0d1b5558-6557-43e9-8cac-a00b4e97ea8b down in Southbound
Dec  5 07:00:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:24Z|00075|binding|INFO|Removing iface tap0d1b5558-65 ovn-installed in OVS
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.769 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], port_security=['fa:16:3e:05:76:3a 10.1.0.6 fdfe:381f:8400::100'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.6/26 fdfe:381f:8400::100/64', 'neutron:device_id': 'b2e8212c-084c-4a4f-b930-56560ae4da12', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca5a0748-2268-4f31-a673-9ef2606c4273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d28d43c-0f17-4a95-87c9-620fe47e764a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a93d84c4-2884-48aa-b436-9baea579d840, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d1b5558-6557-43e9-8cac-a00b4e97ea8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.770 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d1b5558-6557-43e9-8cac-a00b4e97ea8b in datapath ca5a0748-2268-4f31-a673-9ef2606c4273 unbound from our chassis#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.772 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca5a0748-2268-4f31-a673-9ef2606c4273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.773 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[748533e2-e0df-4073-86f2-188794e5c4e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:24.773 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 namespace which is not needed anymore#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:24 np0005546909 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec  5 07:00:24 np0005546909 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000006.scope: Consumed 14.687s CPU time.
Dec  5 07:00:24 np0005546909 systemd-machined[153543]: Machine qemu-9-instance-00000006 terminated.
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.847 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updated VIF entry in instance network info cache for port 75a214ef-2b9f-4c81-bdad-de5791244b85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.848 187212 DEBUG nova.network.neutron [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [{"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:24 np0005546909 nova_compute[187208]: 2025-12-05 12:00:24.875 187212 DEBUG oslo_concurrency.lockutils [req-6f24ca02-9740-42b8-a3d7-0102c673bef3 req-fe3a6ebf-f0e0-4e67-9f93-e5be7d451964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-4e7aec76-673e-48b5-b183-cc9c7a95fd37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.000 187212 INFO nova.virt.libvirt.driver [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Instance destroyed successfully.#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.000 187212 DEBUG nova.objects.instance [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid b2e8212c-084c-4a4f-b930-56560ae4da12 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.015 187212 DEBUG nova.virt.libvirt.vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:58:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-445293436-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-445293436-3',id=6,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-12-05T11:59:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fb2c9c006bee4723bc8dd108e19a6728',ramdisk_id='',reservation_id='r-0ktg9oi1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-AutoAllocateNetworkTest-275048159',owner_user_name='tempest-AutoAllocateNetworkTest-275048159-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T11:59:58Z,user_data=None,user_id='c4c62f22ba09455995ea1bde6a93431e',uuid=b2e8212c-084c-4a4f-b930-56560ae4da12,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.015 187212 DEBUG nova.network.os_vif_util [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converting VIF {"id": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "address": "fa:16:3e:05:76:3a", "network": {"id": "ca5a0748-2268-4f31-a673-9ef2606c4273", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::100", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fb2c9c006bee4723bc8dd108e19a6728", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d1b5558-65", "ovs_interfaceid": "0d1b5558-6557-43e9-8cac-a00b4e97ea8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.016 187212 DEBUG nova.network.os_vif_util [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.017 187212 DEBUG os_vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.019 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d1b5558-65, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.025 187212 INFO os_vif [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:76:3a,bridge_name='br-int',has_traffic_filtering=True,id=0d1b5558-6557-43e9-8cac-a00b4e97ea8b,network=Network(ca5a0748-2268-4f31-a673-9ef2606c4273),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d1b5558-65')#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.026 187212 INFO nova.virt.libvirt.driver [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deleting instance files /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12_del#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.026 187212 INFO nova.virt.libvirt.driver [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deletion of /var/lib/nova/instances/b2e8212c-084c-4a4f-b930-56560ae4da12_del complete#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.074 187212 INFO nova.compute.manager [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG oslo.service.loopingcall [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.075 187212 DEBUG nova.network.neutron [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:25 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : haproxy version is 2.8.14-c23fe91
Dec  5 07:00:25 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [NOTICE]   (214277) : path to executable is /usr/sbin/haproxy
Dec  5 07:00:25 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [WARNING]  (214277) : Exiting Master process...
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.171 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk 1073741824" returned: 0 in 0.749s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:25 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [ALERT]    (214277) : Current worker (214279) exited with code 143 (Terminated)
Dec  5 07:00:25 np0005546909 neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273[214273]: [WARNING]  (214277) : All workers exited. Exiting... (0)
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.172 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.173 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:25 np0005546909 systemd[1]: libpod-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope: Deactivated successfully.
Dec  5 07:00:25 np0005546909 podman[215370]: 2025-12-05 12:00:25.180848987 +0000 UTC m=+0.298921314 container died 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:00:25 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31-userdata-shm.mount: Deactivated successfully.
Dec  5 07:00:25 np0005546909 systemd[1]: var-lib-containers-storage-overlay-77b012e43fc6df7609492b693cc8452628271339171d87e515263e44dc855891-merged.mount: Deactivated successfully.
Dec  5 07:00:25 np0005546909 podman[215370]: 2025-12-05 12:00:25.224501212 +0000 UTC m=+0.342573539 container cleanup 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:00:25 np0005546909 systemd[1]: libpod-conmon-99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31.scope: Deactivated successfully.
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.244 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.245 187212 DEBUG nova.virt.disk.api [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Checking if we can resize image /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.245 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:25 np0005546909 podman[215417]: 2025-12-05 12:00:25.289595068 +0000 UTC m=+0.042749200 container remove 99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.295 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7b0414-9669-4147-a119-d45322cd9bdb]: (4, ('Fri Dec  5 12:00:24 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 (99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31)\n99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31\nFri Dec  5 12:00:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 (99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31)\n99643f06a8953a68f22a6725912f636d770850207613587eb88042599eac6e31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.297 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e8e07b-4d8e-4922-8d4b-651e67212476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.298 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca5a0748-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:25 np0005546909 kernel: tapca5a0748-20: left promiscuous mode
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG nova.virt.disk.api [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Cannot resize image /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.306 187212 DEBUG nova.objects.instance [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'migration_context' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf61bf0-2eaf-4d0a-bce5-19a92d1c7506]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Ensure instance console log exists: /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.325 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9663e937-ec5b-457a-9179-965484e70ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5de29532-7e75-476e-ada1-8b087c8be600]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.344 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db62b7fe-4687-4c09-8a7c-cdf92f83a25f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336377, 'reachable_time': 23474, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215434, 'error': None, 'target': 'ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.357 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca5a0748-2268-4f31-a673-9ef2606c4273 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:00:25 np0005546909 systemd[1]: run-netns-ovnmeta\x2dca5a0748\x2d2268\x2d4f31\x2da673\x2d9ef2606c4273.mount: Deactivated successfully.
Dec  5 07:00:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:25.358 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[83441355-1a82-4721-b076-88191f048d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d0:fa:14 10.100.0.14
Dec  5 07:00:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:25Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d0:fa:14 10.100.0.14
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.762 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936010.76137, 8c58d60e-b997-4eed-8cd4-33ac07d9727a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.763 187212 INFO nova.compute.manager [-] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:25 np0005546909 nova_compute[187208]: 2025-12-05 12:00:25.787 187212 DEBUG nova.compute.manager [None req-345c7b66-fc81-463b-bfb6-41120eb76685 - - - - - -] [instance: 8c58d60e-b997-4eed-8cd4-33ac07d9727a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.024 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Successfully created port: e56fa29b-453e-4140-997d-96c0de8ed4bb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.027 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.027 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.047 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.112 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.113 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.121 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.121 187212 INFO nova.compute.claims [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.250 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.250 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Processing event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.251 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG oslo_concurrency.lockutils [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 DEBUG nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.252 187212 WARNING nova.compute.manager [req-4f9ea5e5-c1b1-4dc9-a30e-bf07ea9d55a1 req-940e83be-fb88-4ecf-81cc-d31a56a0bce3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.253 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.256 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936026.2559118, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.256 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.257 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.260 187212 INFO nova.virt.libvirt.driver [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance spawned successfully.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.260 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.284 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.294 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.295 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.295 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.296 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.296 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.297 187212 DEBUG nova.virt.libvirt.driver [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.323 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.343 187212 DEBUG nova.compute.provider_tree [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.360 187212 INFO nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 6.61 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.360 187212 DEBUG nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.363 187212 DEBUG nova.scheduler.client.report [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.405 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.406 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.451 187212 INFO nova.compute.manager [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 7.19 seconds to build instance.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.456 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.484 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.488 187212 DEBUG oslo_concurrency.lockutils [None req-680b5b58-1f4b-408d-9f92-666494c17335 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.507 187212 DEBUG nova.network.neutron [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.509 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.539 187212 INFO nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Took 1.46 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.605 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.605 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.608 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.610 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.610 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.611 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.611 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.612 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.631 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.697 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.698 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.699 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.709 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.767 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.768 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.806 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.807 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.808 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.867 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.868 187212 DEBUG nova.virt.disk.api [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.869 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.890 187212 DEBUG nova.compute.provider_tree [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.909 187212 DEBUG nova.scheduler.client.report [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.927 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.928 187212 DEBUG nova.virt.disk.api [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.929 187212 DEBUG nova.objects.instance [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.935 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.941 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.942 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.942 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.943 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.943 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.945 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.949 187212 WARNING nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.955 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.956 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.961 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.libvirt.host [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.962 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.963 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.964 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.965 187212 DEBUG nova.virt.hardware [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.969 187212 DEBUG nova.objects.instance [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.971 187212 INFO nova.scheduler.client.report [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance b2e8212c-084c-4a4f-b930-56560ae4da12#033[00m
Dec  5 07:00:26 np0005546909 nova_compute[187208]: 2025-12-05 12:00:26.991 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <name>instance-00000012</name>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:26</nova:creationTime>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:        <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:26 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:26 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:26 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:26 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.140 187212 DEBUG oslo_concurrency.lockutils [None req-0b46d399-6619-44de-a207-81e24326ce78 c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.142 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Successfully updated port: e56fa29b-453e-4140-997d-96c0de8ed4bb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.168 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.171 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.171 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.172 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.342 187212 INFO nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.348 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zxw75vb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.371 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.372 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.372 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 WARNING nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-unplugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state deleted and task_state None.
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.373 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG oslo_concurrency.lockutils [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b2e8212c-084c-4a4f-b930-56560ae4da12-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 DEBUG nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] No waiting events found dispatching network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.374 187212 WARNING nova.compute.manager [req-f786f039-3a2d-488d-8882-448fe1a2880f req-4dec5ad0-377c-4fa2-bc16-006021f59a00 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received unexpected event network-vif-plugged-0d1b5558-6557-43e9-8cac-a00b4e97ea8b for instance with vm_state deleted and task_state None.
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.376 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.476 187212 DEBUG oslo_concurrency.processutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2zxw75vb" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:27 np0005546909 systemd-machined[153543]: New machine qemu-17-instance-00000012.
Dec  5 07:00:27 np0005546909 systemd[1]: Started Virtual Machine qemu-17-instance-00000012.
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.882 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936012.8819108, e83b5d7d-04a7-44d9-a6fe-580f1cfa5838 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.883 187212 INFO nova.compute.manager [-] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] VM Stopped (Lifecycle Event)
Dec  5 07:00:27 np0005546909 nova_compute[187208]: 2025-12-05 12:00:27.905 187212 DEBUG nova.compute.manager [None req-ee5eaabd-82f7-492e-82ad-29c7dedcfaf4 - - - - - -] [instance: e83b5d7d-04a7-44d9-a6fe-580f1cfa5838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.031 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936028.0302062, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.031 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.033 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.033 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.036 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.036 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.056 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.062 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.069 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.070 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.071 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.071 187212 DEBUG nova.virt.libvirt.driver [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936028.0306618, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.098 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.102 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.132 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.151 187212 INFO nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 1.54 seconds to spawn the instance on the hypervisor.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.151 187212 DEBUG nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.216 187212 INFO nova.compute.manager [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 2.13 seconds to build instance.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.247 187212 DEBUG oslo_concurrency.lockutils [None req-45650104-12de-45d3-94e6-03353306b032 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.372 187212 DEBUG nova.network.neutron [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.425 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.425 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance network_info: |[{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.429 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start _get_guest_xml network_info=[{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.432 187212 WARNING nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.436 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.437 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.441 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.441 187212 DEBUG nova.virt.libvirt.host [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.442 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.442 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.443 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.444 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.445 187212 DEBUG nova.virt.hardware [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.448 187212 DEBUG nova.virt.libvirt.vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.449 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.449 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.451 187212 DEBUG nova.objects.instance [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.573 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <uuid>bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</uuid>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <name>instance-00000011</name>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersTestJSON-server-2038537603</nova:name>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:28</nova:creationTime>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:user uuid="4aa579f9c54f43039ef96c870ed5e049">tempest-ServersTestJSON-2138545093-project-member</nova:user>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:project uuid="a211e57445104139baeb5ca8fa933c58">tempest-ServersTestJSON-2138545093</nova:project>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        <nova:port uuid="e56fa29b-453e-4140-997d-96c0de8ed4bb">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="serial">bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="uuid">bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:37:62:a3"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <target dev="tape56fa29b-45"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/console.log" append="off"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:28 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:28 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:28 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:28 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.574 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Preparing to wait for external event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.574 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.575 187212 DEBUG nova.virt.libvirt.vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.576 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.576 187212 DEBUG nova.network.os_vif_util [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.577 187212 DEBUG os_vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.578 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.578 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.582 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.582 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape56fa29b-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.583 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape56fa29b-45, col_values=(('external_ids', {'iface-id': 'e56fa29b-453e-4140-997d-96c0de8ed4bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:62:a3', 'vm-uuid': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:28 np0005546909 NetworkManager[55691]: <info>  [1764936028.5856] manager: (tape56fa29b-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.587 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.592 187212 INFO os_vif [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45')#033[00m
Dec  5 07:00:28 np0005546909 podman[215482]: 2025-12-05 12:00:28.693720781 +0000 UTC m=+0.064026087 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.759 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.759 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.760 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] No VIF found with MAC fa:16:3e:37:62:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:00:28 np0005546909 nova_compute[187208]: 2025-12-05 12:00:28.760 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Using config drive#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.042 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936014.0413435, 48f123c5-f925-4f6f-94e5-d109e25ef206 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.042 187212 INFO nova.compute.manager [-] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.067 187212 DEBUG nova.compute.manager [None req-0a6319e9-ff30-44f6-8c43-678eb17e7ac9 - - - - - -] [instance: 48f123c5-f925-4f6f-94e5-d109e25ef206] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.186 187212 INFO nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Creating config drive at /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.191 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yr3tqos execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.316 187212 DEBUG oslo_concurrency.processutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8yr3tqos" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:29 np0005546909 kernel: tape56fa29b-45: entered promiscuous mode
Dec  5 07:00:29 np0005546909 systemd-udevd[215477]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:29Z|00076|binding|INFO|Claiming lport e56fa29b-453e-4140-997d-96c0de8ed4bb for this chassis.
Dec  5 07:00:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:29Z|00077|binding|INFO|e56fa29b-453e-4140-997d-96c0de8ed4bb: Claiming fa:16:3e:37:62:a3 10.100.0.3
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.3747] manager: (tape56fa29b-45): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.377 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:62:a3 10.100.0.3'], port_security=['fa:16:3e:37:62:a3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a211e57445104139baeb5ca8fa933c58', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b4dad292-a18a-4c80-b443-fe4ecc60c1b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eb759a3-016c-413a-81bd-572c3bccb661, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e56fa29b-453e-4140-997d-96c0de8ed4bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.380 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e56fa29b-453e-4140-997d-96c0de8ed4bb in datapath 16e72b69-f48e-48c4-b5b8-b2731e24f397 bound to our chassis#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.384 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16e72b69-f48e-48c4-b5b8-b2731e24f397#033[00m
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.3922] device (tape56fa29b-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.3929] device (tape56fa29b-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:00:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:29Z|00078|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb ovn-installed in OVS
Dec  5 07:00:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:29Z|00079|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb up in Southbound
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7064ee2-eb55-4991-b410-f517f1888bb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.397 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16e72b69-f1 in ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.397 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.399 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16e72b69-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.399 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92c28497-8ab6-4f20-8178-1cc027a4ef11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.401 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[024ab82c-3cfa-4e5b-9266-1ef25b0faa44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.418 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa8145f-6fc6-43b8-9470-c9565e31a8a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 systemd-machined[153543]: New machine qemu-18-instance-00000011.
Dec  5 07:00:29 np0005546909 systemd[1]: Started Virtual Machine qemu-18-instance-00000011.
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90f9faea-74be-41a9-8574-be3c481b2953]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.491 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[afb02701-4d8f-4517-853c-32ec860605ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc80bd8-133e-456c-8a3f-6e22055856d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 systemd-udevd[215519]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.5119] manager: (tap16e72b69-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.558 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3afc10b4-5f6c-4695-8e01-5b9564b2dd1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.561 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5b7d0b-5329-43c7-afdf-b4bdb31f9bf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.5952] device (tap16e72b69-f0): carrier: link connected
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.597 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4ebf68-b402-4d64-911b-edf2bcac87c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.613 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e39357e-1d6d-4b9a-822c-1dc3d7f24d12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16e72b69-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:3e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341015, 'reachable_time': 25571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215553, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.629 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee418151-51cb-4d6d-9e6f-54291ede2892]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:3ef0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341015, 'tstamp': 341015}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215554, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.643 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4de32a91-f2e6-4db8-ae51-2be4d991817f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16e72b69-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:3e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341015, 'reachable_time': 25571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215555, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0d15c4-7bae-42a0-9e2f-716be73a4021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.727 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94c21476-cd3c-4f70-94b2-ae8f3f219525]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16e72b69-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.729 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.730 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16e72b69-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:29 np0005546909 NetworkManager[55691]: <info>  [1764936029.7335] manager: (tap16e72b69-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:29 np0005546909 kernel: tap16e72b69-f0: entered promiscuous mode
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.745 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16e72b69-f0, col_values=(('external_ids', {'iface-id': 'ed62467c-0aee-45a7-a6b0-252916dfc244'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:29Z|00080|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec  5 07:00:29 np0005546909 nova_compute[187208]: 2025-12-05 12:00:29.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.759 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.760 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[79a59751-2087-4bd9-87bf-b2945b9ee770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.761 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-16e72b69-f48e-48c4-b5b8-b2731e24f397
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/16e72b69-f48e-48c4-b5b8-b2731e24f397.pid.haproxy
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 16e72b69-f48e-48c4-b5b8-b2731e24f397
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:00:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:29.762 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'env', 'PROCESS_TAG=haproxy-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16e72b69-f48e-48c4-b5b8-b2731e24f397.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.023 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936030.0233805, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.024 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.046 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.050 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936030.023566, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.050 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.078 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.082 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.102 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.299 187212 DEBUG nova.compute.manager [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.300 187212 DEBUG nova.compute.manager [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing instance network info cache due to event network-changed-9275d01b-3eb9-429b-a0ba-0cb60048987a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:30 np0005546909 nova_compute[187208]: 2025-12-05 12:00:30.302 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Refreshing network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:30 np0005546909 podman[215594]: 2025-12-05 12:00:30.282949246 +0000 UTC m=+0.036033969 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:00:30 np0005546909 podman[215594]: 2025-12-05 12:00:30.371462029 +0000 UTC m=+0.124546692 container create 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:00:30 np0005546909 systemd[1]: Started libpod-conmon-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope.
Dec  5 07:00:30 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:00:30 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5b1d7643321fc89fc61c887a583f48ca73e0a371a4b5fe52022732576250580/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:00:30 np0005546909 podman[215594]: 2025-12-05 12:00:30.449166535 +0000 UTC m=+0.202251228 container init 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:00:30 np0005546909 podman[215594]: 2025-12-05 12:00:30.455413943 +0000 UTC m=+0.208498606 container start 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:00:30 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : New worker (215615) forked
Dec  5 07:00:30 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : Loading success.
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.364 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Received event network-vif-deleted-0d1b5558-6557-43e9-8cac-a00b4e97ea8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.365 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.365 187212 DEBUG nova.compute.manager [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing instance network info cache due to event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:31 np0005546909 nova_compute[187208]: 2025-12-05 12:00:31.366 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.312 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updated VIF entry in instance network info cache for port 9275d01b-3eb9-429b-a0ba-0cb60048987a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.313 187212 DEBUG nova.network.neutron [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [{"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.331 187212 DEBUG oslo_concurrency.lockutils [req-83d94064-faf4-4c49-814b-d7e58d1a4cbe req-ba81bc48-8ef4-4055-be30-3170a41483e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-597f2994-fdad-46b1-9ef7-f56d62b4bbd0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.446 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updated VIF entry in instance network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.446 187212 DEBUG nova.network.neutron [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:32 np0005546909 nova_compute[187208]: 2025-12-05 12:00:32.487 187212 DEBUG oslo_concurrency.lockutils [req-3d14a90d-0dbf-4fc4-9bad-2b8ea6d5d823 req-968bf269-ac1c-420c-874a-e667006556cc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.774 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.775 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.797 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.878 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.879 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.887 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:33 np0005546909 nova_compute[187208]: 2025-12-05 12:00:33.887 187212 INFO nova.compute.claims [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.039 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936019.0387871, 04518502-62f1-44c3-8c57-b3404958536f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.040 187212 INFO nova.compute.manager [-] [instance: 04518502-62f1-44c3-8c57-b3404958536f] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.061 187212 DEBUG nova.compute.manager [None req-e8f0d9d6-36a8-4bda-b42b-a6593efea64c - - - - - -] [instance: 04518502-62f1-44c3-8c57-b3404958536f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.090 187212 DEBUG nova.compute.provider_tree [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.104 187212 DEBUG nova.scheduler.client.report [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.128 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.129 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.169 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.170 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.192 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.207 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.294 187212 INFO nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Rebuilding instance#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.308 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating image(s)#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.310 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.311 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.311 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.328 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.401 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.402 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.403 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.414 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.473 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.474 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.513 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.513 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.514 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.530 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.553 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.568 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.569 187212 DEBUG nova.virt.disk.api [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.569 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.602 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_requests' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.616 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.621 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.622 187212 DEBUG nova.virt.disk.api [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.622 187212 DEBUG nova.objects.instance [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.635 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.636 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.636 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Ensure instance console log exists: /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.637 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.645 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.656 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.660 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.886 187212 DEBUG nova.policy [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:00:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:34Z|00081|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:34Z|00082|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec  5 07:00:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:34Z|00083|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec  5 07:00:34 np0005546909 nova_compute[187208]: 2025-12-05 12:00:34.946 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:35Z|00084|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:35Z|00085|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec  5 07:00:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:35Z|00086|binding|INFO|Releasing lport 79bf1a96-6e90-41b7-8356-9756185de59f from this chassis (sb_readonly=0)
Dec  5 07:00:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:35.053 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:35.057 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.105 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.106 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.155 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Processing event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.156 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 DEBUG oslo_concurrency.lockutils [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 DEBUG nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] No waiting events found dispatching network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.157 187212 WARNING nova.compute.manager [req-ed249569-d2a9-402b-8aa4-33181c6cf4f9 req-b2aba5a3-4980-40ca-8f9d-a9c00588f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received unexpected event network-vif-plugged-e56fa29b-453e-4140-997d-96c0de8ed4bb for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.158 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936035.1645408, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.165 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.167 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.172 187212 INFO nova.virt.libvirt.driver [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance spawned successfully.#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.174 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.185 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:35 np0005546909 podman[215640]: 2025-12-05 12:00:35.196936379 +0000 UTC m=+0.052133476 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.194 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.208 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.209 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.210 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.211 187212 DEBUG nova.virt.libvirt.driver [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.310 187212 INFO nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 11.04 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.311 187212 DEBUG nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.373 187212 INFO nova.compute.manager [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 11.67 seconds to build instance.#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.391 187212 DEBUG oslo_concurrency.lockutils [None req-5a35cbc0-0662-4dac-a6a6-0af676de8764 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.430 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.430 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.431 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.431 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.678 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:35.763 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec  5 07:00:35 np0005546909 nova_compute[187208]: 2025-12-05 12:00:35.796 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Successfully created port: 47612a1a-e470-434b-927c-8fcd6c2fbe4e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:00:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:36.059 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:36 np0005546909 nova_compute[187208]: 2025-12-05 12:00:36.891 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936021.8897574, 5150eaf5-c0ca-48ab-9045-af5a1c785c8e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:36 np0005546909 nova_compute[187208]: 2025-12-05 12:00:36.892 187212 INFO nova.compute.manager [-] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:36 np0005546909 nova_compute[187208]: 2025-12-05 12:00:36.947 187212 DEBUG nova.compute.manager [None req-3d62cb01-e3b2-4923-bd48-a2cb40bdfdd6 - - - - - -] [instance: 5150eaf5-c0ca-48ab-9045-af5a1c785c8e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 05 Dec 2025 12:00:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-0db9964f-1820-4db9-887c-2ed75b418e20 x-openstack-request-id: req-0db9964f-1820-4db9-887c-2ed75b418e20 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec  5 07:00:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}, {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec  5 07:00:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.230 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-0db9964f-1820-4db9-887c-2ed75b418e20 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec  5 07:00:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:37.232 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.537 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.601 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.602 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.603 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.604 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.604 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.629 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.629 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.630 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.630 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.745 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.808 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.809 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.829 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Successfully updated port: 47612a1a-e470-434b-927c-8fcd6c2fbe4e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.850 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.850 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.851 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.867 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.874 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.930 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.931 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:37 np0005546909 nova_compute[187208]: 2025-12-05 12:00:37.995 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.001 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.057 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.059 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.120 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.127 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.155 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.190 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.191 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.252 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.258 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 05 Dec 2025 12:00:37 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-09d873fe-7b46-42b6-a563-314e2b893c8c x-openstack-request-id: req-09d873fe-7b46-42b6-a563-314e2b893c8c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.262 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f used request id req-09d873fe-7b46-42b6-a563-314e2b893c8c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.263 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'name': 'tempest-ServersAdmin275Test-server-1823558123', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000012', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d28e47b844b47238fb8386dae6c546e', 'user_id': '3a90749503e34bda87974b2c22626de0', 'hostId': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.265 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'name': 'tempest-ServersAdminTestJSON-server-1562123791', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.267 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'name': 'tempest-ServersAdminTestJSON-server-1785289561', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.268 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'name': 'tempest-ServersTestJSON-server-2038537603', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000011', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a211e57445104139baeb5ca8fa933c58', 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'hostId': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.270 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'name': 'tempest-ServersAdminTestJSON-server-720093205', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98815fe6b9ea4988abc2cccd9726dc86', 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'hostId': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.271 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'hostId': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.272 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '16d2f26b00364f84b1702bb7219b8d31', 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'hostId': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.296 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.latency volume: 275954440 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.297 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.latency volume: 5231646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.319 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.latency volume: 548477667 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.320 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.latency volume: 25050026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.351 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.latency volume: 293495470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.351 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.latency volume: 24220875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.355 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.358 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.387 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.latency volume: 191881453 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.387 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.latency volume: 718160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.413 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.latency volume: 203180993 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.413 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.latency volume: 23551307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.421 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.429 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.442 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.latency volume: 275895128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.443 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.latency volume: 29196254 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.474 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.latency volume: 572759749 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.474 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.latency volume: 39616245 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea25c099-7922-43c0-a91f-fb1c4f8e463b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 275954440, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03a51cf4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'ef21f6c4907a28df102b4b97d7a358af966b451295ec63b9fa5f24ab4d1a30a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5231646, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': 
None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03a530d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '56d938fc0ba8814f40efb00999f63007d48ef23ec5b1bc15b92db5d1bdca7f34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 548477667, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03a887ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'af64c65b95a50fb0f9dc4586814e9aebe47381570eb68579d6fcd565fc5651e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25050026, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03a893ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'b225f25d367bd57b362d84b5e75be71c309dadab4de4e7cf1f306353c13f2fd0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 293495470, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ad56da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'f69e3deed95380c9d6d91b67868f15efbca27fb1ee189b7d3fa4c33aa40b95b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24220875, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'i
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ad61a2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '79a5529c3387874e476d9f8867d7f7a9112df14ca5751fb9c711be1ca26053ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 191881453, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03b2dbaa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3b7de2bd0daa8c1701b34a61a952c355c56ff6ae11c853adb29c39491fdae355'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 718160, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03b2e85c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '1f37671353e101af48cb8f806b628ee5e4c4b52a6da37bf86a174af28ae64f6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203180993, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03b6d03e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'b1782155c1f6742db0b0c00dc6570aa571fd5061ae65a267cbac88cd44036726'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23551307, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03b6db6a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5e92a8f476e006504d314c339a3b8fa734ef77f708cae7ad0c6d96cbff5ab137'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 275895128, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': 
None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03bb5ec4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '2fb5723ed3563d1fe9477acc98534b5f5130e076e5b44ae0fcba07796536ba57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29196254, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 
0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type':
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: : 'sda'}, 'message_id': '03bb6ef0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '93da2b4b5ee2d712e395817e9ee93ad801666932a3529c86480daaca1be84def'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 572759749, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c01a86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '3a91b488dd74af71f02d7e9db6875642d72d468619e1384fb6b0778927d64b2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39616245, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.272931', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c02544-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '3fd2055d2ca0c7c968212a1df59aee57793d9cf14008e96f4a948f1b68b1d88d'}]}, 'timestamp': '2025-12-05 12:00:38.474809', '_unique_id': '5af8f6dfd34a4eccb4671312eaa75b8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.487 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.492 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa / tapf194d74d-a9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.492 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.495 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 / tap380c99a7-94 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.495 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.497 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d / tape56fa29b-45 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.497 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.500 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4e7aec76-673e-48b5-b183-cc9c7a95fd37 / tap75a214ef-2b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.500 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.504 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 / tap9275d01b-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.504 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b112dc6-48b4-44ac-8fb0-122210cbf963', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03c2ecca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'f1c7057e348ef540e88bea30ba2f0effff592bc9e6b23358f145443dabba9eb9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03c35886-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'a74ad0ec201c7b59ac71d1bd863660a32c139f45b3e4ad48ea3dbf117f3e575e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03c3afca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '51c17405c369a67834b41f925a3656bcc9598856e26ebbce5702aaa365b50415'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03c4220c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'd15a5e2c6f50ca844e26dc7622f5de456dcd4c7caba4121e0ddde0759b7b4247'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.487503', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03c4b1ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '78cdd9cec6479440979e03a1d853c9e29c6ffd7c61f23bb13c3a182ed92d76e8'}]}, 'timestamp': '2025-12-05 12:00:38.504652', '_unique_id': '00237ce6c1a848219e3878b7fb8e4b4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.505 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.506 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.507 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.508 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e83a61a7-86b2-4b18-9a43-3bcc8a320413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03c530fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '7933392c58636ee107e36747e90a31667b4d1a4881174f95c76dbe61f220e370'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03c53da4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'dc10f64ceb2eac437985f27f0cd7913289195f99059b8a1c4d6fdb58162944f7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03c54b1e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '8aae540af3cbadb289ca239d634bd69b05ca3ff82ae1d9d97eae73cff9486560'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03c5541a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '0d8f44693d5daf5a0b3e31c37be30151abadd5c003b023efb767427c3f94aaed'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.507569', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03c55cda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'b199710f178da0374535ed18a6611187de2d54a8d1708311228c2055e9865a9e'}]}, 'timestamp': '2025-12-05 12:00:38.508981', '_unique_id': 'c37d91822d7c43ad8cae2f001e050bb5'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.510 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.bytes volume: 17154048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.bytes volume: 72876032 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.511 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.bytes volume: 72859648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.512 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.bytes volume: 26034176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.513 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.bytes volume: 72953856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.514 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18921d62-561f-48cb-a7fe-924640409c51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 17154048, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5b388-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '4873610171500068da1f62399948bfd270f98eaa33e1a62b7bb46cc1ef303cb6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5bfae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'ba8f5911e4b74e078dfe6fd511c35230a07eda87d15b5308f223bcbd145350d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72876032, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5ca58-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '0ce6d46dd2724f5ad84d987bcd3a00bbf8aa90b877e127a6b8d747ddcc8bd20f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5d520-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '481b8eb02c0d44662f7cbcfbdbbb7c4be1487d0071469d82714070019cac6a60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72859648, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5dfac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '22d9ab40ac577cf3e5b2bbc857663886fbd2040e3699c72a947b9134291633b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture'
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: ot_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5ee2a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '5a751fdde318b1ecd332cca560de7f5788d961d956569f150b2969c0a103a55e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c5f7ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': 'e849937f2abb914342e0679b6ec10ad5208186c8f55531fd6d33b33ae646eccc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.510946', 
'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c60266-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '64ca3d254707b42a45173f7b87a34ce686b919f9505f7cb658f4c2f5f99bdbef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 26034176, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c61134-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'ce58c46054dfa41dbcb4025c42ca6e6ea538ecf9c97ee6df4cebdd6b3e296b15'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c61b0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5ba76c438f159fa6b80a0591d1ac5da3f69f529388dcfcfa435493e8e1ab65a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72953856, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 
'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c62552-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '4cadaa9715b8968ffd83ed1887655d9e236a81085d0b7be00b788ff87c491cc7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: ime': 3419.034940067, 'message_signature': 'c7f9960c08eef33e48cd48124739bc968d2346d781bfc121afe84e36fc41bbc4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c63d4e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'e1b45f8b562cd9aa9520a658c4368dd6356f6aa5c2388a7f0a675e0208f1c1a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.510946', 'resource_metadata': {'display_name': 
'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c64744-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '03b0d5a6ae38733b6f8619130f26f4597eba37865c448a2ba908dfc0a715d2e5'}]}, 'timestamp': '2025-12-05 12:00:38.515011', '_unique_id': 'be7233566b93449abe82a524eba1a090'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.517 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.517 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.518 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.requests volume: 1054 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.519 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.requests volume: 1034 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.520 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.requests volume: 1133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.521 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.521 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.522 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'accc251c-b1c8-4236-bffb-257ae26c221d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6c160-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '857543cfcefd636e7e2b1f9d6d9df079f7e55fb20406c7bd0c15322a1b717a47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': 
None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6cdb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '862e83ae44944604bb7444bde126875d220c812c961ffbdceb75f5adfd530db3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6db64-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '9b98fcf10cbcfe0d1d31263d2d542f457e93813eb36907a345991f060734db70'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6e55a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '2e6ca7e5c2f70859218afcdc99dfe8a405196aac19a5330168902918ddd65633'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1054, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c6efdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '48a9cb4e564c5d7f62b32d10dd21c1052af4e48a49dd0ad8d342b824a01572cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 28, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c6f900-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'bc840329355ca347a04d560f06e0e1fc0e108f547480138504ae3919e7264ae4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c706c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3fd8681c2cbf4ab1ad46529514f5c6601b15447c7a692ff31f8591dbaebf8ac8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c70fee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '7502c8be3c9d32b2b32e0f00ff08857da6f4c2b8df433089068611d58374a513'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1034, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c71a34-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'ea8710599d6ece4430575a69b44d9c8ee79e36e4d8cfe9aa84179a1fa33eeff3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c72402-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'fd92f7058704174e2961dabea841f4213fb8fd4a5f7ab97aaa80ac55fba0bff1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 
'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c7330c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': 'e879040e864430780336a8665b00a18d447dee86189b1438570b756ddebca434'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', '
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: isk_name': 'sda'}, 'message_id': '03c73fe6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '6bc3bc48441d65ff24c5587886a1e3c584fdcfdacf1e6bce8ed449d85a6c0797'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1133, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03c74a0e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'c738b231a290d34d24e027436894eb1a44964adb3cccf93a84718e9a9a26be2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.517781', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c758aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'ae7560d975a82772c3c137962a6022cd09ce64fa1ef6b0cb6e570c595db2fb2d'}]}, 'timestamp': '2025-12-05 12:00:38.522040', '_unique_id': 'bc157b52d87240f494ae4bbd5ac84d74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging 
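[editor's note] The traceback above bottoms out in `amqp/transport.py` at `self.sock.connect(sa)`, i.e. a plain TCP connect to the RabbitMQ broker is being refused before any AMQP handshake happens. A minimal sketch of the same pre-handshake check (hypothetical helper name; the broker host/port would come from the service's `transport_url`):

```python
import socket

def check_amqp_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    Mirrors the first step of amqp.transport._connect: if nothing is
    listening, the connect raises the same [Errno 111] seen in the log.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except ConnectionRefusedError:
        # [Errno 111] Connection refused -- broker down or port closed.
        return False
```

Running this against the host/port from the agent's `transport_url` distinguishes "broker not listening" (refused, as here) from "network path blocked" (which would time out instead).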
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.545 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.546 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.555 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.556 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.570 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.570 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.582 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.584 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.585 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.595 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.596 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.596 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.611 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.612 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.622 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.622 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ac272a-e50f-4798-a216-326340bf6bd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cb0ab8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'f8ec4601230c08f066db4e62f4a9752e0bc9a7cc324b631d7bd356e20e2d8765'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03cb1e18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'ef9a7c83a662a05bf67e1a921bae72c786a2f3999ef7945918294a42a0381125'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cc9b62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': 'ad6255f7e12c6782f4619dbfe4766f9618bfef60a491614abc520d26b79e07ba'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ccace2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '706274286121935611b6a19190f5e359748b9aa5cdcc2246479c1b4cda60920f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03cece00-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'f25bdca3a92f09fdba9360cdd14519bb4c97fff94d653356b60649b5d6be1ba9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_typ
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: me': 'sda'}, 'message_id': '03ced8dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'ae6f86bd744b410c14e2f53f00cf9c32f4a9c58c7e531df439cee65dcf02035c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d0fa72-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '80a711a346909825eb8d7796c6b41fce1c78c08445b74cfc34c6d383954358f4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': 
{'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d10c6a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '2faf34bddea40138e02820499a773659a216971eb114732bec035f402b2d9cf8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d2b5ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '787fee2049f6bdf55f93963cbbd18d5f77636483464078abeed87ef4ceede660'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d2c1c2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '5a919137c299f63787c64f4ab9de8e24adbaafc3747f3caa07fbbefa8ccb2720'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': 
'2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d5270a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '5d7c8f6e3a4587100c30295e3219a20edd6fc028d1f1d2f44601e4b3fbafa3fa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'me
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: gnature': 'dd64bf683cbb3d781c1720467944b93b3afd6ba48a0b027cf016c108073e9dae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03d6b5d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '6e65ad472f2ef89560fc0f31378421aaa804e6a7449543eacea59e539f9938e6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.524402', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 
'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03d6c1e6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '997e4caa9f22a21a1d0501d53d92e10ac87081f475e7292a71625bb47e3812b2'}]}, 'timestamp': '2025-12-05 12:00:38.622985', '_unique_id': '48054cd6e19f405bb96e7c594cc0ff29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.627 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.628 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.628 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ffd8d66-2a9c-4a79-9e06-f7e8c3cee6a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d771f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '6602d44df4996678c1e969bbd1487e59500d0bc384cd01c253ee5d0dd207558d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d77b18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'dfdec2f65e437bef52017dc8d3962fd592af796ab6c602f69765f2432b2b9f64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d78a36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '404b67b878a1d43617849b741d57106728b2c2b6d806266b091ffe2b77b9f2d9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d795da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'c50e87c9bf3d11f4fa9a7dce93c1ddc90c7de66259956ad7aa1cdf632781cd64'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.627062', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d7a0ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'b1b607a4c77a9c9fe956fb05eda597387e8f14975af75a25b881b159ee48cfa3'}]}, 'timestamp': '2025-12-05 12:00:38.628685', '_unique_id': 'faf7fb3884fd4e139832d657ed2722eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.631 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.632 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.633 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f15334b-45d7-4c2a-b87a-11b5d4844f7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d82a72-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '238dc83090981537d84c5c9f558c448600b5049f7221cc2a4f184a9d81e911a5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d83620-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': 'e21f16743172c92d6290eab70b2931b6b1caa84fb8911b628c38c7a1cfe77cd8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d84372-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '84f933b65d16b945250dde57d67824db883bd04d92e4c8fc13d56d3d2114b029'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d84ebc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'a5cf264ffcdc56a22594e79615aa6f2ca29568415ca742d91193c7f6d10c69d2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.631784', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d85c5e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '62763996172660ac496adfd692739cb5b4e7588c5fee22434b2a4d7a1542c74d'}]}, 'timestamp': '2025-12-05 12:00:38.633485', '_unique_id': '3e95d1448aa54385865eee4a657e4c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.634 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.636 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.636 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.637 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '132d3a01-37bb-4254-8cf6-217b96d30908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d8e11a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'e975572e7b159b8ce5d9a879aa422d97322d957ed98355e657af96f06144b7a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d8ee26-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '2e88116828c703ff058d29265a4c77f1b9a58f7a43d69e72fe67328ce9864039'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d8f92a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '7ba1d5d9e7937ba2d7fe237801894745abee68bcb6d63e7f3f79d91c0e9a39ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d905c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '69791f2c0873951a1ce3af0261fc12bf193496a67dd9959264d8ac6a84a3ac4c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.636488', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d91676-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '7f1c85c1546c5422073befc6d61a586b51f7e149fbd46d0812716aeb4057cfc2'}]}, 'timestamp': '2025-12-05 12:00:38.638256', '_unique_id': '89a4dba561dc49af84fc71e511b37d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.480 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.638 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.639 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.642 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.643 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.643 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3aee12e8-e692-4e71-9c51-cbc43772437c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03d9bfb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'e92ea10d2ab9817ec8abc33aa549fe970489190d7331a52c026ee1293965ea83'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03d9cb0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '33b870e2ea91f128328b472eca2c6ddcf011dc62cacf245b87c0ddc3030f49c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03d9d7be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'db0d01415042e9bb28a7c347c3e7c084a9d598ac5f369c9bb1e793686562e908'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03d9e038-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '3bc735a3e05bfeb1490d3552cea1cab587763258b7237cbad528f9063346ce0c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.642178', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03d9f028-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '4ac6a2b7e8bb3e761fc1c7d70bef3cffcf3147c9dd7f5c8a8b9b637ec887b066'}]}, 'timestamp': '2025-12-05 12:00:38.643830', '_unique_id': '44e9336d3f284a2da80b83e5f5bc0a4c'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.644 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.645 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.647 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.648 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.648 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.649 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.649 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37effe1e-806b-4a55-8176-8aac73a44e44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03da9ba4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '8206c301336333a0e913303af107100e99c8d40c9778b828222e6f06f448042e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03daa900-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '4ae4362dc1c65bc5e6cbf0d4466fb2a669ed24cd1ca219acb101305f28abf47f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03dab9cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'a4a69f3c3f890a8e11d950bac56fe1dd289e97890eb22509ba04c992fd2b34ea'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03dac6c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '7f77f902c7e0457c695c2beb8207681c24f65c10289aa977ef40e08e4f332391'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.647742', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03dad33a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'ee981055c4a1ce8d760d353d020c7a85ca717e3bb13a9e6cdfd50628ff7229f7'}]}, 'timestamp': '2025-12-05 12:00:38.649737', '_unique_id': 'a6cd876f295a4276a86a4b2dc383594c'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.650 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.653 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.653 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.requests volume: 164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.654 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.655 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.requests volume: 236 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.656 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.requests volume: 310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.657 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11fb5b8b-0cc5-48ee-92e4-320df53d249d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 164, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03db83ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '5e3cfaa9a99373051231006d4aa7e2aad20953fb40832afcf8cf0c6189b8ce62'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': 
None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03db8f64-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '9e22e6ca6db77202345c845170325c75a0ca5713781393bc86833fbf5ee3bdd0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03db9d74-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'eefb000370ed6ad9df658939d10f1c2713373c7fb82bfc308a1e3b653a3d5e2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dba5b2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '4f3b71491e9b86f98fce33b9acadd4f2b624e5512d7b464727369ea3984a6621'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 
'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbb16a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '993fda24a135676ba52a5ad6d99d5459be8d313bd1ba3e7d5d73ef98b95739b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e'
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 8, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbbff2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '1b242ffb0772dfba0011703bbfe1aa4131360858f8ec92728ffb170a95e7be8e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbcdc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '664214605ad941b569fa9fdf74bf07556b75c6586495e90d946b9a0062a73110'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 
'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbd9a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '9ef6edb19befbd85455c1e82ba372d5baf3fc584c9f27544683519623be38529'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 236, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbe50e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '5d6cd4eeda23095daffad6e7194fa65e77f9c3ab78836444170559727a81c5b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dbf2c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '03d7e6acf27b76460601eb4c65f224d6d06b4ebdacd0ba8b8574a9755b285c5e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 
'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dbffda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '06983a2f446ade7a558f4ac37e10e04502b64fbb4b9bc4df5419dd6ca894734d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: k_name': 'sda'}, 'message_id': '03dc0ae8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': 'e03df298c1b0fc99ca65dfcf5ec6387c9bbde12298da85c08ea3ac156f5c5bf5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 310, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dc170e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '665e46704c9303bc4fd13585eb9904421cb7fe74d39f8eb569ee562acec24c87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': 
'2025-12-05T12:00:38.653694', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dc226c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '12c5ffb385319db7003a157a9372a0a2437014b31c5eaa55b153f5b25e7e0db3'}]}, 'timestamp': '2025-12-05 12:00:38.658213', '_unique_id': 'ad42a5d74f7147c29526e6cb76648a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging 
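The root symptom in the traceback above is `ConnectionRefusedError: [Errno 111]` raised from `self.sock.connect(sa)` in amqp's transport layer, i.e. nothing is listening on the broker's TCP port when the agent tries to publish. A minimal sketch of that check, independent of kombu/amqp (the host and port below are illustrative stand-ins; 5672 is only RabbitMQ's conventional default, not necessarily what this deployment uses):

```python
import socket

def check_amqp_reachable(host="127.0.0.1", port=5672, timeout=3.0):
    """Attempt a bare TCP connect to the broker port.

    An ECONNREFUSED (errno 111) here corresponds to the
    ConnectionRefusedError seen in the amqp transport traceback:
    the machine is up but no process is accepting on that port.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except ConnectionRefusedError:
        return False

check_amqp_reachable()
```

If this returns False for the configured transport_url host/port, the problem is the broker (stopped, restarting, or firewalled) rather than anything inside ceilometer or oslo.messaging.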
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.659 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.662 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.663 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.663 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.663 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '066eff7e-3e42-4b55-9e3e-35e79b32b6f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03dcc654-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '855888e016c891b885bbdaa8225ca75505a8287012fc33b77e1f76c5659b7ce4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03dcd2d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '91e222cdaa14e7c6be3ebe72c4636f340438d3ca6f82c5e628da4dc9642c3496'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03dcde14-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'd42690b36cb0e87b6f70f433be91b6b163f5deac1979e51c6a6637727e640654'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03dce9b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '073ed2fdc319dc7877174cc4e388d60d0bb91f5715df544881fb5cecb4dee2ce'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.661988', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03dcf598-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': 'fa50f9b7b673d459500ab77d9c6770a7804a8f724a1a115717610b6f1c4431f7'}]}, 'timestamp': '2025-12-05 12:00:38.663627', '_unique_id': '3d40a08daef94cbcb0b7eba87b4f6490'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.664 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.665 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.667 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.668 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.668 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.516 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ot_gb': 1, 'disk_name': 'sda'}, 'message_id': '03c5ee2a-d1d2-11f0-8572-fa163e006 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7425b213-21dc-4cd8-9c81-4d20ed9d0635', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03dd9462-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': 'ed38a294fa2ecf482ffb55cdd91abdf6a5c2b805efe2147b0530ee395ab146ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03dda006-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '9c06f7080de8e586e7ff72480d40f5fea27a2ecddd73489891ac6505f195069f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 
'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03ddada8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': 'f9462ccfd706b36814fce5312d486bf273e6414b3b1d3bdf5531349940a97eb6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03ddbb54-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': '44f1a790a5531aef47eb3a26be38433e19a47afc3884585ed22e753443ad7429'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.667263', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03ddcba8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '3e40ba938911e2455da7deb64804cce038541058261a4d29f51b421777a8d0c5'}]}, 'timestamp': '2025-12-05 12:00:38.669123', '_unique_id': '1b31b6bf0f384bbea1ba7cc05d9e7747'}: kombu.exceptions.OperationalError: 
[Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.669 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.670 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.672 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.672 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.673 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.latency volume: 405864442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.latency volume: 5228938005 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.674 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.latency volume: 3763362476 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.675 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.latency volume: 3976242325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.676 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.latency volume: 4168675680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.latency volume: 7049264369 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.677 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.523 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 28, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a22e0851-ed3b-4971-9ed0-80d64596d815', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 405864442, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03de9268-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'cf93d9dafba0d12568c25b382801c3beb9dc9793972ca3f46c9f462ef191cb8b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03de9e0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '569f4c64aaa24077611924002337e07de22618cd853b9128c36e894f601155d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5228938005, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dea6d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '2a21ef38a14168e9b97ccd84993776a93732273d745df3b7eb666a2e599295c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03deb482-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '887fec3fd2100501ec9dca7d3cbc59333f309ea0309c0c137542d87202d072d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3763362476, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': 
None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03debfea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '13dccc815b258d6da46fb656a814dc5f3f3956eb0013dc5e8d70f85a3fdc5924'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03decbde-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': '705a5622f64131dee71740090b2ab01ada9cf71dc6ecbc394c3f307fad213a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ded6c4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '127aef846059fe1d67c1e3305123b25cab2db8320a9d3631dc75f1b83034781f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': 
'2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dedf70-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '443b7bcd4d5ee9a7e7d0eb82e67846710092942a90ddf3d906cbb1f2ee09bba6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3976242325, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03deec7c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'fe387280febd96b910083821159873ae17a06e8d9ad0e34aa5e5047bf2b01809'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03def884-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'c17c07d8c6748b77568e75ec2a777bf4f610d7b8c0d01aff13b14c3f27ba8327'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4168675680, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 
'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03df0388-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '35abb078167b3dc3817cf722cb6d0d73e567579088a5f4518134dcdd68388b59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb'
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 04-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '0092daecb36638a5f94b16428884e409003d1f1aa54757d4bd54612c837032f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7049264369, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03df1aa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'e02c3a4d876773d4ee497e3e72a6dad0c9e5cf394205ed72894345767f250361'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.673804', 'resource_metadata': 
{'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03df255c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'f51f383109c22d7927cde92e8188cb2972999d2f6f4093ef5d3a81791e1f390c'}]}, 'timestamp': '2025-12-05 12:00:38.677947', '_unique_id': '63e7a5fea63645b4ba32c36d1fa72fe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.679 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.681 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.bytes volume: 30292480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.bytes volume: 29227520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.682 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.bytes volume: 28912640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.bytes volume: 30648832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.683 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.684 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.bytes volume: 31009280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.684 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8faecabd-469a-4002-b1bf-a5e22b6b7db9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfc48a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': '10cdbcf7af69375f2c0d36c54ce476a2633853434c11fdb794006602d65c1a8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 
'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfccc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.89407383, 'message_signature': 'a9593e789a3bcc8ea24e7c6fecb4ded3d30327d2022722df4bf68bc28e8b93d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30292480, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfd3e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': 'dd50577d79bb60f20b19f16b3ff796e22606921ff2c09648452eee4f1b74d35b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfdad8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.919224597, 'message_signature': '095c7965eb7fd843b43391509d7a8273f7b1dccad9ba82ec1bb7019dea2e721a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29227520, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dfe1b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'a837b6206bce83179f5eaf0264ba51854be614ff8bea7129007f18ede9cfca8e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'archi
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfe8c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.941357858, 'message_signature': 'a0e627199d7ee04e124b0842e6fc1295cc6994d263a9487ef39cf057bfff0c74'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03dff234-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '3df30e10a15f17241d2f645960aac856011af39b3e896bee3e7519f8d1137103'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': 
'2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dff96e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3418.972847386, 'message_signature': '08f5d7bb0df8f1d55d8aaf004417ad5f1776fe4ce9f899cfc35f547b6deece85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28912640, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e0006c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': '1aeea85dc3e924fc0e0de6c3c4325531a1ff2036a75bac6073316c253499a126'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03e00774-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.00912062, 'message_signature': 'f7a7a5876fb72853eae51393d2fdb6470e4febc35d7da47bcbefdb47d718d068'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30648832, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 
'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e00e68-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '5b5e3a49a0644128fc455dca86370510caea294787738c39ae600bcc56e7d4b7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'e
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: e006c52', 'monotonic_time': 3419.034940067, 'message_signature': '051b2dc001de3319bc07a999b58e268dabc932c4eda681940d97d739e2f6e4c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31009280, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03e01d36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': '9aa3ed12b3898032019fe8778fa9eda7c8fddf796cc43ae61f7d4f95e71cbc99'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.681784', 'resource_metadata': {'display_name': 
'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03e0243e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.064982783, 'message_signature': 'edb1fea5c110d8f5be9fea2cf130ebed1c0bb8811c1def2ef5f520ea6e4b8aaa'}]}, 'timestamp': '2025-12-05 12:00:38.684460', '_unique_id': 'b69e850149ca47908d4b687bd5c650d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.623 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: me': 'sda'}, 'message_id': '03ced8dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_ti [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.685 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.701 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/memory.usage volume: 40.45703125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.713 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.659 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 8, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.734 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/memory.usage volume: 42.62890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.678 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03decbde-d [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.685 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: : 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03dfe8c0-d1d2-11f0-8572-f [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.752 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.761 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.762 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d: ceilometer.compute.pollsters.NoVolumeException
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.775 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/memory.usage volume: 40.4921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.797 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/memory.usage volume: 40.9296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.813 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/memory.usage volume: 42.30859375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80adce3e-9262-4bea-8a6f-f4a4db671cc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.45703125, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e2b622-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.321760055, 'message_signature': 'b97c1f3ce6d45a0063dfcaf601e7ade9dcc66fe1221c9694856318691480e25d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 
'3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e49a50-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.334417796, 'message_signature': 'a0ccb9170d8c6e50475ce98e77cd7e5d5dfa6beb8fb9e95e837d5774655369e9'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.62890625, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03e7d3f0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.355400074, 'message_signature': 'fbdcd0fd41c55c05cac32c96c376fc461c56be7b313a02d31ac1f338af69651e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4921875, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03ee15d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.39633785, 'message_signature': '43e051b01ec9c14ba4ad07a31c05e99aa36b119b6ce5a90e54a271601dd0819f'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.9296875, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 
'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03f17248-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.41843059, 'message_signature': '48a3142921cc54301eedf699056fadc0b082ea42385d1438cad8fb66f98131ec'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.30859375, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'timestamp': '2025-12-05T12:00:38.686068', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03f4099a-d1d2-11f0-8572-fa163e006c
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: b16a715759d6de377e939058571b'}]}, 'timestamp': '2025-12-05 12:00:38.815228', '_unique_id': 'd32e1f0d556e4a5b9b3bfa566905cf40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/cpu volume: 9740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/cpu volume: 12110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/cpu volume: 12160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.818 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/cpu volume: 3270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/cpu volume: 11060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/cpu volume: 11660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.819 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5189173-640c-4222-abed-0bae8ddf318d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9740000000, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f499c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.321760055, 'message_signature': 'e5ba9bf2fb66344133a16d77e20b6e2fae1bdc95bc5e9d725db4114bf3d619ee'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12110000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 
'3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4a1fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.334417796, 'message_signature': 'dbd13288a344231e0fa94cdd64243459b99950d8971626fcd2c825f8798f6862'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12160000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4a972-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.355400074, 'message_signature': 'fc0617f7e2bfbee79ed2f64d09001e9ba9e49c866520529bb75c0adcb84a69f6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3270000000, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4b37c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.382517636, 'message_signature': '489c605bbc07db52d74459c3040cf09203293642e44a73d3184b6ec8eb79bc8f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11060000000, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 
'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4bea8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.39633785, 'message_signature': 'c7a95d2695a8e185765a587f48d23390528d0ce24b552512cd5b26af9d93830b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11660000000, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_numbe
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: message_signature': '4e01042460fcd06d015477d77999aa06e8826b9aa10ebed3b38e62af538ec120'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'timestamp': '2025-12-05T12:00:38.818265', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f4cdc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.43491245, 'message_signature': '26619e08bc40a79d9f2e0ffb2c422b7424e14e6d656d6bc5245a0662f45d53dc'}]}, 'timestamp': '2025-12-05 12:00:38.819890', '_unique_id': '64eba5bdac7242349cfe0035ebf767fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdmin275Test-server-1823558123>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1562123791>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-1785289561>, <NovaLikeServer: tempest-ServersTestJSON-server-2038537603>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-720093205>, <NovaLikeServer: tempest-AutoAllocateNetworkTest-server-2092831344>, <NovaLikeServer: tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876>]
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.822 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.823 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.824 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.825 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.826 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0aba0a-65ee-4248-a338-92ec9e6d9b17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f546ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': '0dae6ce365469c3dfc584ccf1d86c2d315b7e10bada2ddb0e44e61b891b7ddfb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f54f62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'd245fed6a79456f52758eab7c429d9b100633c195662e115ba95d2d5440fbd81'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f55a20-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '545bf0404006c9086ffc8182a9b895c9d4288cf37d4db844ea69dead334587ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5633a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': 'fe1d262e2c6e8fe182f71cdbb19872fcea18f0c1e8f9384c3bd736bc44fa8151'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': 
'98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f56ac4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '1cbd1774b9e6c73029bdfbe15150b612a1b681a17bc3c24719f2ce73c538fad7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: f573e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '0f2bb59e1fe935d938ae2eefa05cea834f6d2302f017f3cc060d88408159a1d2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f57bf4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '6f606828ccf6ae18da6e84c5d56aa7e06c1bb516c2dc9223ac5f7310df496fa6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 
'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f58b94-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '19b3352a4fd2db16c98de87d15e8632dd3dc8dc971d9e042012dc63b75229b3b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5933c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': 'a034b440c05d550ea81fc738dbdf5c0a772ff0ce014d7d7beee93693d6873f32'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5a106-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': 'b8409814c57e8a61e7b389717248f1e4e26cefd42e495af2d0eaa58c80e9dd09'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': '2025-12-05T12:00:38.822754', 
'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5ae9e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '9ec46d929c826a4d7ec3983426a5a6f2b9f6572b73ee91da2bebd0e9e19461ac'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5bc72-d1d2-11f0-8572-fa163e006c52', 'monotoni
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 9db0353d5f8f8f8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f5cb86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '9f68808588bed8b5282bcb65a4d704d336d7ef8d481a65d3438cca940500e875'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.822754', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': 
'597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f5d9aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': 'ccc3cf9686de5c41432ed04b4f63dc74ede97021d637c6362a0e6f7bb16928f6'}]}, 'timestamp': '2025-12-05 12:00:38.826861', '_unique_id': '187a90cf808e416998e1a627c3cfaac3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/network.incoming.bytes volume: 1604 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.830 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.831 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/network.incoming.bytes volume: 784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.831 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de9e2a72-3dfe-4c67-a573-c99df085fdab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000d-3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-tapf194d74d-a9', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'tapf194d74d-a9', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:fa:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf194d74d-a9'}, 'message_id': '03f66d8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.110623755, 'message_signature': '1b6bb1cc32150dfaf7e41111cbe48f0bfcf16d079ab93f949244525f961c7e07'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1604, 'user_id': 
'1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-0000000c-982a8e69-5181-4847-bdfe-8d4de12bb2e4-tap380c99a7-94', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'tap380c99a7-94', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:4f:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap380c99a7-94'}, 'message_id': '03f67950-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.114460314, 'message_signature': '7bfabc409afff4e719baf7abbdbf56ba395cfe99e6eddac8b93d976382d65088'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'instance-00000011-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-tape56fa29b-45', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'tape56fa29b-45', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:62:a3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape56fa29b-45'}, 'message_id': '03f68666-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.11679276, 'message_signature': '14ea34aa0ca729fdab559b880c7bc6957995c4817d32c1c77a6b2255c1ef4bbe'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 784, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': 'instance-00000010-4e7aec76-673e-48b5-b183-cc9c7a95fd37-tap75a214ef-2b', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'tap75a214ef-2b', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:d9:46:fb', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap75a214ef-2b'}, 'message_id': '03f690e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.119174828, 'message_signature': 'c12f1678819b064e6e3643546f82f36c442de2210d3d9bbb996ee662121d060e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': 'instance-0000000a-597f2994-fdad-46b1-9ef7-f56d62b4bbd0-tap9275d01b-3e', 'timestamp': '2025-12-05T12:00:38.830282', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'tap9275d01b-3e', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f5:93:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9275d01b-3e'}, 'message_id': '03f69afc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.123564574, 'message_signature': '0aa793697a01e828b570ced5bdc86750323d244be7c42f750436115fe41506ec'}]}, 'timestamp': '2025-12-05 12:00:38.831849', '_unique_id': 'c92d2112dad4434ea958b9eba2972722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.832 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.833 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 52d63666-4caa-4eaa-9128-6e21189b0932/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.834 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] 982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.835 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.836 12 DEBUG ceilometer.compute.pollsters [-] 4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.836 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.837 12 DEBUG ceilometer.compute.pollsters [-] caa6c7c3-7eb3-4636-a7ad-7b605ef393ba/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.837 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.838 12 DEBUG ceilometer.compute.pollsters [-] 597f2994-fdad-46b1-9ef7-f56d62b4bbd0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '674af94b-450d-4499-8e18-6f9c7cea8113', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': '52d63666-4caa-4eaa-9128-6e21189b0932-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f6f948-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': 'cae1421aee19af202d1133b5e438c0a0ce5214cf213040d291a2f17ad02d0008'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3a90749503e34bda87974b2c22626de0', 'user_name': None, 'project_id': '6d28e47b844b47238fb8386dae6c546e', 'project_name': None, 'resource_id': 
'52d63666-4caa-4eaa-9128-6e21189b0932-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdmin275Test-server-1823558123', 'name': 'instance-00000012', 'instance_id': '52d63666-4caa-4eaa-9128-6e21189b0932', 'instance_type': 'm1.nano', 'host': 'aa93222660f73b1b2e7f7b4d2aff11977b7f1ac2007c72c23d3882a7', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f701fe-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.145586921, 'message_signature': '8f552493e9a49c14e4385f197e81165008d8031df8a1c330f48eff4bd1c38ea0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f70ad2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '066550a8a213c902b157952cb97478e40cb0b6cb259cd3eb74621233692da1a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1562123791', 'name': 'instance-0000000d', 'instance_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f7132e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.167784264, 'message_signature': '1e497373d14122752c81a138907c0e73f7cc00532337d7332d45555db012bb95'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 
'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f71f90-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': '221ecb6cce6a1d9096956da2ceb0ca6fc391f119c7fec8945c258b27c7b79a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1785289561', 'name': 'instance-0000000c', 'instance_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'h
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: sda'}, 'message_id': '03f727a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_signature': 'dd0e13fc7818ef6704f3419b57f54c71a6cd38f08276e01e52c5c57f209f1359'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f72fa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': '2cc78990e4c22950544bd0b029eee1937a13e09b45059f9d4d0de323be9821af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4aa579f9c54f43039ef96c870ed5e049', 'user_name': None, 'project_id': 'a211e57445104139baeb5ca8fa933c58', 'project_name': None, 'resource_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': 
{'display_name': 'tempest-ServersTestJSON-server-2038537603', 'name': 'instance-00000011', 'instance_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'instance_type': 'm1.nano', 'host': 'd8b41479b9a7c67017169397962c4cebac1989d97a0b7e072730591b', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f73a0c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.192265682, 'message_signature': 'a370b20be7851507d914a1902767ed6a7b6ed99e5ee50c688bb21872edb77939'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f74916-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '01c4ef46afc639541bd630926ececccf36ab1d9d3db0787578ffeaf4220980de'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1ac3c267120a4aeaa91f472943c4e1e2', 'user_name': None, 'project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'project_name': None, 'resource_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-720093205', 'name': 'instance-00000010', 'instance_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'instance_type': 'm1.nano', 'host': '75f906c1774f5510bc410cc503317215a8141981f407f5fe521a1109', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f76220-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.206620542, 'message_signature': '80f8698a48dab3434ba12917ea2411e1b14171e8fa2ef4c501e62a254ae57215'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-vda', 'timestamp': 
'2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f773be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.217814101, 'message_signature': '844496276426a4f77c5d3570db59ecd9d0693aee362e3d03f85f128aaae186e1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c4c62f22ba09455995ea1bde6a93431e', 'user_name': None, 'project_id': 'fb2c9c006bee4723bc8dd108e19a6728', 'project_name': None, 'resource_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-AutoAllocateNetworkTest-server-2092831344', 'name': 'instance-00000001', 'instance_id': 'caa6c7c3-7eb3-4636-a7ad-7b605ef393ba', 'instance_type': 'm1.nano', 'host': 'da68a9d6eaec9038a25a77fc6c2919593e02778fb95b1c989b15484c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id':
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: '96e6a986c70a8a7b017fa546759cc38fa3459c35f59838ba95d133e8accd6b61'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-vda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f792fe-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '41232417bbc791a8dded9dddb1a385d58f53f1b2085e75991c4411b52b2c723f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd4754b88440a4ea08a37067ef9234672', 'user_name': None, 'project_id': '16d2f26b00364f84b1702bb7219b8d31', 'project_name': None, 'resource_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0-sda', 'timestamp': '2025-12-05T12:00:38.833779', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876', 'name': 
'instance-0000000a', 'instance_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'instance_type': 'm1.nano', 'host': '8abdc1f15ec3bf6b7bb1c71cb9bfe22c035833f84ae5c791855d1f74', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f79e34-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.233943831, 'message_signature': '146682dbcb992f1277ac2cd359da855f98a3243723760806ebd0eef58d5b4537'}]}, 'timestamp': '2025-12-05 12:00:38.838448', '_unique_id': 'e4d5ba672a0643dba92cd3b23f6fb9c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:00:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.960 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.816 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4576MB free_disk=73.19064712524414GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:38 np0005546909 nova_compute[187208]: 2025-12-05 12:00:38.962 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.820 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.828 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: f573e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3419.178065698, 'message_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:00:38.839 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: sda'}, 'message_id': '03f727a6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.074 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.075 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 52d63666-4caa-4eaa-9128-6e21189b0932 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.076 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.077 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 8 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.077 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1536MB phys_disk=79GB used_disk=8GB total_vcpus=8 used_vcpus=8 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG nova.compute.manager [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-changed-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG nova.compute.manager [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Refreshing instance network info cache due to event network-changed-47612a1a-e470-434b-927c-8fcd6c2fbe4e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.083 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.232 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.249 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.268 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.269 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.597 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.597 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.598 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.599 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.599 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.600 187212 INFO nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Terminating instance#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.602 187212 DEBUG nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:39 np0005546909 kernel: tap9275d01b-3e (unregistering): left promiscuous mode
Dec  5 07:00:39 np0005546909 NetworkManager[55691]: <info>  [1764936039.6261] device (tap9275d01b-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:00:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:39Z|00087|binding|INFO|Releasing lport 9275d01b-3eb9-429b-a0ba-0cb60048987a from this chassis (sb_readonly=0)
Dec  5 07:00:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:39Z|00088|binding|INFO|Setting lport 9275d01b-3eb9-429b-a0ba-0cb60048987a down in Southbound
Dec  5 07:00:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:39Z|00089|binding|INFO|Removing iface tap9275d01b-3e ovn-installed in OVS
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.634 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.648 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:93:9d 10.100.0.7'], port_security=['fa:16:3e:f5:93:9d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '597f2994-fdad-46b1-9ef7-f56d62b4bbd0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d2f26b00364f84b1702bb7219b8d31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba7f2e39-8114-45e5-bd44-4ae84ab46fc6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd38fa62-d49e-4607-8d3e-179b767c8786, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9275d01b-3eb9-429b-a0ba-0cb60048987a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.649 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9275d01b-3eb9-429b-a0ba-0cb60048987a in datapath e5a9559e-b860-47a2-b44b-45c7f67f2119 unbound from our chassis#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.651 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5a9559e-b860-47a2-b44b-45c7f67f2119, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.652 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec695521-33d1-41d0-bc3a-9bd60630bf80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.653 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 namespace which is not needed anymore#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.659 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec  5 07:00:39 np0005546909 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000a.scope: Consumed 14.069s CPU time.
Dec  5 07:00:39 np0005546909 systemd-machined[153543]: Machine qemu-11-instance-0000000a terminated.
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.715 187212 DEBUG nova.network.neutron [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.726 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.726 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:00:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:39Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:46:fb 10.100.0.5
Dec  5 07:00:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:39Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:46:fb 10.100.0.5
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance network_info: |[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.746 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.747 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Refreshing network info cache for port 47612a1a-e470-434b-927c-8fcd6c2fbe4e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.749 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start _get_guest_xml network_info=[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.754 187212 WARNING nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.762 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.762 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.769 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.769 187212 DEBUG nova.virt.libvirt.host [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.770 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.770 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.771 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.772 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.773 187212 DEBUG nova.virt.hardware [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:39 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : haproxy version is 2.8.14-c23fe91
Dec  5 07:00:39 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [NOTICE]   (214635) : path to executable is /usr/sbin/haproxy
Dec  5 07:00:39 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [WARNING]  (214635) : Exiting Master process...
Dec  5 07:00:39 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [ALERT]    (214635) : Current worker (214637) exited with code 143 (Terminated)
Dec  5 07:00:39 np0005546909 neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119[214631]: [WARNING]  (214635) : All workers exited. Exiting... (0)
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.777 187212 DEBUG nova.virt.libvirt.vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715
947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:34Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.777 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.778 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:39 np0005546909 systemd[1]: libpod-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope: Deactivated successfully.
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.779 187212 DEBUG nova.objects.instance [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:39 np0005546909 podman[215764]: 2025-12-05 12:00:39.787071459 +0000 UTC m=+0.045012074 container died 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.802 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <uuid>d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</uuid>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <name>instance-00000013</name>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-1974624987</nova:name>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:39</nova:creationTime>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        <nova:port uuid="47612a1a-e470-434b-927c-8fcd6c2fbe4e">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="serial">d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="uuid">d95c0324-d1d3-4960-9ab7-3a2a098a9f7c</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:45:e2:12"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <target dev="tap47612a1a-e4"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/console.log" append="off"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:39 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:39 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:39 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:39 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.804 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Preparing to wait for external event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.804 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.805 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.805 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.806 187212 DEBUG nova.virt.libvirt.vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTe
stJSON-715947304-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:00:34Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.806 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.807 187212 DEBUG nova.network.os_vif_util [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.807 187212 DEBUG os_vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.809 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.810 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.813 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.813 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.814 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.815 187212 INFO nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Terminating instance#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquired lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.816 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.817 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47612a1a-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.818 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47612a1a-e4, col_values=(('external_ids', {'iface-id': '47612a1a-e470-434b-927c-8fcd6c2fbe4e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:e2:12', 'vm-uuid': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:39 np0005546909 NetworkManager[55691]: <info>  [1764936039.8222] manager: (tap47612a1a-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:39 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c-userdata-shm.mount: Deactivated successfully.
Dec  5 07:00:39 np0005546909 systemd[1]: var-lib-containers-storage-overlay-15ac9401aab02c277b66f0b8b3e087367793eb4fcc0a66aa2c56cb8b76ba06f3-merged.mount: Deactivated successfully.
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.836 187212 INFO os_vif [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4')#033[00m
Dec  5 07:00:39 np0005546909 podman[215764]: 2025-12-05 12:00:39.858394423 +0000 UTC m=+0.116335028 container cleanup 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:00:39 np0005546909 systemd[1]: libpod-conmon-24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c.scope: Deactivated successfully.
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.882 187212 INFO nova.virt.libvirt.driver [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Instance destroyed successfully.#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.883 187212 DEBUG nova.objects.instance [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lazy-loading 'resources' on Instance uuid 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.904 187212 DEBUG nova.virt.libvirt.vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',display_name='tempest-FloatingIPsAssociationNegativeTestJSON-server-147223876',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationnegativetestjson-server-147223876',id=10,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d2f26b00364f84b1702bb7219b8d31',ramdisk_id='',reservation_id='r-5f154sc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_m
odel='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441',owner_user_name='tempest-FloatingIPsAssociationNegativeTestJSON-4920441-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:05Z,user_data=None,user_id='d4754b88440a4ea08a37067ef9234672',uuid=597f2994-fdad-46b1-9ef7-f56d62b4bbd0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.905 187212 DEBUG nova.network.os_vif_util [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converting VIF {"id": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "address": "fa:16:3e:f5:93:9d", "network": {"id": "e5a9559e-b860-47a2-b44b-45c7f67f2119", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-2084698636-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d2f26b00364f84b1702bb7219b8d31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9275d01b-3e", "ovs_interfaceid": "9275d01b-3eb9-429b-a0ba-0cb60048987a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.906 187212 DEBUG nova.network.os_vif_util [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.906 187212 DEBUG os_vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.919 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9275d01b-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.926 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.927 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.927 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:45:e2:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.928 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Using config drive#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.929 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.934 187212 INFO os_vif [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:93:9d,bridge_name='br-int',has_traffic_filtering=True,id=9275d01b-3eb9-429b-a0ba-0cb60048987a,network=Network(e5a9559e-b860-47a2-b44b-45c7f67f2119),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9275d01b-3e')#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.934 187212 INFO nova.virt.libvirt.driver [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deleting instance files /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0_del#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.935 187212 INFO nova.virt.libvirt.driver [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deletion of /var/lib/nova/instances/597f2994-fdad-46b1-9ef7-f56d62b4bbd0_del complete#033[00m
Dec  5 07:00:39 np0005546909 podman[215812]: 2025-12-05 12:00:39.938356543 +0000 UTC m=+0.048252577 container remove 24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[241c11d4-9fa1-4063-bb99-1dc13fe75f11]: (4, ('Fri Dec  5 12:00:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 (24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c)\n24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c\nFri Dec  5 12:00:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 (24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c)\n24a71b63b4a5dd036219aefb08816dbeabf1e9ff309019d01ceb14188d72ea6c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.945 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1c5243-4c6d-4fdf-b382-3b2f39abc51a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.946 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5a9559e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 kernel: tape5a9559e-b0: left promiscuous mode
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.961 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.966 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0daf182-c846-438a-be46-bbe3a28aa075]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.983 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6aabd3-e749-4848-8888-768c4eaaf536]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:39.986 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43f613d8-2b9b-4aa8-996f-ba9f97a0b590]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.999 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936024.9982498, b2e8212c-084c-4a4f-b930-56560ae4da12 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:39 np0005546909 nova_compute[187208]: 2025-12-05 12:00:39.999 187212 INFO nova.compute.manager [-] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.000 187212 INFO nova.compute.manager [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.000 187212 DEBUG oslo.service.loopingcall [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.001 187212 DEBUG nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.001 187212 DEBUG nova.network.neutron [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e85cc0a-7917-42f5-8aec-5fdf73e731b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337401, 'reachable_time': 44825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215832, 'error': None, 'target': 'ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:40 np0005546909 systemd[1]: run-netns-ovnmeta\x2de5a9559e\x2db860\x2d47a2\x2db44b\x2d45c7f67f2119.mount: Deactivated successfully.
Dec  5 07:00:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.004 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e5a9559e-b860-47a2-b44b-45c7f67f2119 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:00:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:40.004 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[407bcdf2-95d2-4890-a313-756e5b14ecb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.017 187212 DEBUG nova.compute.manager [None req-d112b245-f889-4b6c-81c3-9ef6ff76efdf - - - - - -] [instance: b2e8212c-084c-4a4f-b930-56560ae4da12] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.379 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:40 np0005546909 NetworkManager[55691]: <info>  [1764936040.5545] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec  5 07:00:40 np0005546909 NetworkManager[55691]: <info>  [1764936040.5553] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:40Z|00090|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:40Z|00091|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec  5 07:00:40 np0005546909 nova_compute[187208]: 2025-12-05 12:00:40.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.807 187212 DEBUG nova.network.neutron [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.835 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Releasing lock "refresh_cache-caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.836 187212 DEBUG nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.852 187212 INFO nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Creating config drive at /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config#033[00m
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.858 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwuxa0jl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:41 np0005546909 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec  5 07:00:41 np0005546909 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 16.976s CPU time.
Dec  5 07:00:41 np0005546909 systemd-machined[153543]: Machine qemu-1-instance-00000001 terminated.
Dec  5 07:00:41 np0005546909 nova_compute[187208]: 2025-12-05 12:00:41.982 187212 DEBUG oslo_concurrency.processutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuwuxa0jl" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.036 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.037 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.037 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG oslo_concurrency.lockutils [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.038 187212 DEBUG nova.compute.manager [req-64513919-369a-42cf-b29f-9f57153d81b7 req-156a0a43-134e-4c10-8457-e397711d41a6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-unplugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:00:42 np0005546909 kernel: tap47612a1a-e4: entered promiscuous mode
Dec  5 07:00:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:42Z|00092|binding|INFO|Claiming lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e for this chassis.
Dec  5 07:00:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:42Z|00093|binding|INFO|47612a1a-e470-434b-927c-8fcd6c2fbe4e: Claiming fa:16:3e:45:e2:12 10.100.0.10
Dec  5 07:00:42 np0005546909 NetworkManager[55691]: <info>  [1764936042.0671] manager: (tap47612a1a-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 systemd-udevd[215743]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.072 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.073 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.076 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:00:42 np0005546909 NetworkManager[55691]: <info>  [1764936042.0802] device (tap47612a1a-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:00:42 np0005546909 NetworkManager[55691]: <info>  [1764936042.0813] device (tap47612a1a-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:00:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:42Z|00094|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e ovn-installed in OVS
Dec  5 07:00:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:42Z|00095|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e up in Southbound
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.093 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b15b4ce-081d-4935-bfde-b692c02f314f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.107 187212 INFO nova.virt.libvirt.driver [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance destroyed successfully.#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.107 187212 DEBUG nova.objects.instance [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lazy-loading 'resources' on Instance uuid caa6c7c3-7eb3-4636-a7ad-7b605ef393ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.124 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[61f0f223-d298-4fe2-a64b-48942ebe97b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.127 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a88bfd26-2793-4e09-937e-a391f75de20e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.124 187212 INFO nova.virt.libvirt.driver [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deleting instance files /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba_del#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.125 187212 INFO nova.virt.libvirt.driver [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deletion of /var/lib/nova/instances/caa6c7c3-7eb3-4636-a7ad-7b605ef393ba_del complete#033[00m
Dec  5 07:00:42 np0005546909 systemd-machined[153543]: New machine qemu-19-instance-00000013.
Dec  5 07:00:42 np0005546909 systemd[1]: Started Virtual Machine qemu-19-instance-00000013.
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.162 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cffd0cc5-ce20-4c06-959d-98862247feb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.183 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e294fd-6bd2-4b9b-b418-c4f5fc9ceede]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215868, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.184 187212 INFO nova.compute.manager [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG oslo.service.loopingcall [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.185 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.201 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[37eab8fc-5d1b-4caa-84ef-5c2dfce1f058]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215872, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215872, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.204 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.207 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.207 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.208 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:00:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:00:42.208 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.450 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updated VIF entry in instance network info cache for port 47612a1a-e470-434b-927c-8fcd6c2fbe4e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.451 187212 DEBUG nova.network.neutron [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.471 187212 DEBUG oslo_concurrency.lockutils [req-bd48c1ac-9832-4a5d-ab30-ba003cd065fe req-29c82e3c-5299-4706-8967-829d15c6f513 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936042.4756083, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.497 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.505 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936042.4757717, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.505 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.528 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.533 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.551 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.602 187212 DEBUG nova.network.neutron [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.621 187212 INFO nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Took 2.62 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.656 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.678 187212 DEBUG nova.network.neutron [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.686 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.686 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.691 187212 INFO nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Took 0.51 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.736 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.863 187212 DEBUG nova.compute.provider_tree [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.880 187212 DEBUG nova.scheduler.client.report [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.900 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.902 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.921 187212 INFO nova.scheduler.client.report [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Deleted allocations for instance 597f2994-fdad-46b1-9ef7-f56d62b4bbd0#033[00m
Dec  5 07:00:42 np0005546909 nova_compute[187208]: 2025-12-05 12:00:42.984 187212 DEBUG oslo_concurrency.lockutils [None req-637bc850-3e45-4de0-8130-2744cd2f6df6 d4754b88440a4ea08a37067ef9234672 16d2f26b00364f84b1702bb7219b8d31 - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:43 np0005546909 nova_compute[187208]: 2025-12-05 12:00:43.067 187212 DEBUG nova.compute.provider_tree [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:43 np0005546909 nova_compute[187208]: 2025-12-05 12:00:43.086 187212 DEBUG nova.scheduler.client.report [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:43 np0005546909 nova_compute[187208]: 2025-12-05 12:00:43.108 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:43 np0005546909 nova_compute[187208]: 2025-12-05 12:00:43.134 187212 INFO nova.scheduler.client.report [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Deleted allocations for instance caa6c7c3-7eb3-4636-a7ad-7b605ef393ba#033[00m
Dec  5 07:00:43 np0005546909 nova_compute[187208]: 2025-12-05 12:00:43.190 187212 DEBUG oslo_concurrency.lockutils [None req-55addd2c-0006-416e-bb41-c01de86929cb c4c62f22ba09455995ea1bde6a93431e fb2c9c006bee4723bc8dd108e19a6728 - - default default] Lock "caa6c7c3-7eb3-4636-a7ad-7b605ef393ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:43 np0005546909 podman[215883]: 2025-12-05 12:00:43.207855597 +0000 UTC m=+0.057630975 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm)
Dec  5 07:00:44 np0005546909 nova_compute[187208]: 2025-12-05 12:00:44.716 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:00:44 np0005546909 nova_compute[187208]: 2025-12-05 12:00:44.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.078 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "597f2994-fdad-46b1-9ef7-f56d62b4bbd0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.079 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] No waiting events found dispatching network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 WARNING nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received unexpected event network-vif-plugged-9275d01b-3eb9-429b-a0ba-0cb60048987a for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing instance network info cache due to event network-changed-e56fa29b-453e-4140-997d-96c0de8ed4bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:45 np0005546909 nova_compute[187208]: 2025-12-05 12:00:45.080 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Refreshing network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:00:46 np0005546909 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec  5 07:00:46 np0005546909 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000012.scope: Consumed 11.913s CPU time.
Dec  5 07:00:46 np0005546909 systemd-machined[153543]: Machine qemu-17-instance-00000012 terminated.
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:47Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:62:a3 10.100.0.3
Dec  5 07:00:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:47Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:62:a3 10.100.0.3
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.730 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.735 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.#033[00m
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.740 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.#033[00m
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.741 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del#033[00m
Dec  5 07:00:47 np0005546909 nova_compute[187208]: 2025-12-05 12:00:47.742 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete#033[00m
Dec  5 07:00:48 np0005546909 podman[215934]: 2025-12-05 12:00:48.186787833 +0000 UTC m=+0.044862560 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.312 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.312 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.313 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.313 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.314 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.315 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.315 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.673 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.673 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.692 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.805 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.805 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.811 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:48 np0005546909 nova_compute[187208]: 2025-12-05 12:00:48.812 187212 INFO nova.compute.claims [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:49 np0005546909 podman[215953]: 2025-12-05 12:00:49.230625497 +0000 UTC m=+0.090988375 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350)
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.493 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updated VIF entry in instance network info cache for port e56fa29b-453e-4140-997d-96c0de8ed4bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.494 187212 DEBUG nova.network.neutron [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [{"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.511 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Received event network-vif-deleted-9275d01b-3eb9-429b-a0ba-0cb60048987a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.512 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.513 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Processing event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.514 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.514 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG oslo_concurrency.lockutils [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 DEBUG nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] No waiting events found dispatching network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.515 187212 WARNING nova.compute.manager [req-9c75a004-66c1-4016-abb2-045acc417a54 req-7fd0deb8-ee19-483c-b286-f688b332b0a2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received unexpected event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e for instance with vm_state building and task_state spawning.
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.518 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.522 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936049.5219681, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.522 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Resumed (Lifecycle Event)
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.525 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.541 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.542 187212 INFO nova.virt.libvirt.driver [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance spawned successfully.
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.543 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.574 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.577 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.577 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.578 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.578 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.579 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.579 187212 DEBUG nova.virt.libvirt.driver [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.650 187212 INFO nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 15.34 seconds to spawn the instance on the hypervisor.
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.650 187212 DEBUG nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.706 187212 DEBUG nova.compute.provider_tree [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.809 187212 INFO nova.compute.manager [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 15.96 seconds to build instance.
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.816 187212 DEBUG nova.scheduler.client.report [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.828 187212 DEBUG oslo_concurrency.lockutils [None req-89d6a42a-27d5-4fbc-b779-deb5083e1141 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.838 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.839 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.887 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.887 187212 DEBUG nova.network.neutron [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.909 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:49 np0005546909 nova_compute[187208]: 2025-12-05 12:00:49.925 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.007 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.008 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.008 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating image(s)
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.009 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.009 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.010 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.023 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:50Z|00096|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:00:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:00:50Z|00097|binding|INFO|Releasing lport ed62467c-0aee-45a7-a6b0-252916dfc244 from this chassis (sb_readonly=0)
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.080 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.081 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.082 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.096 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.151 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.152 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.188 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.189 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.190 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.244 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.246 187212 DEBUG nova.virt.disk.api [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Checking if we can resize image /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.246 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.303 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.305 187212 DEBUG nova.virt.disk.api [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Cannot resize image /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.305 187212 DEBUG nova.objects.instance [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'migration_context' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.318 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.319 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Ensure instance console log exists: /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.320 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.415 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.475 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.477 187212 DEBUG nova.virt.images [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] 6e277715-617f-4e35-89c7-208beae9fd5c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.479 187212 DEBUG nova.privsep.utils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.480 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.677 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.part /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.682 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.754 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db.converted --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.756 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.774 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.806 187212 DEBUG nova.network.neutron [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.807 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.809 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.814 187212 WARNING nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.818 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.819 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.823 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.824 187212 DEBUG nova.virt.libvirt.host [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.825 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.825 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.826 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.827 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.828 187212 DEBUG nova.virt.hardware [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.834 187212 DEBUG nova.objects.instance [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'pci_devices' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.848 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.849 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.850 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.860 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.880 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <uuid>c5241646-e089-40a3-b197-60aff60ea075</uuid>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <name>instance-00000014</name>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiagnosticsNegativeTest-server-1685795102</nova:name>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:50</nova:creationTime>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:user uuid="32b963f457f74f00ad4c8ac7fa298e83">tempest-ServerDiagnosticsNegativeTest-546957392-project-member</nova:user>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:        <nova:project uuid="38d566f1d23b4fccb2a68f0a7aa78d72">tempest-ServerDiagnosticsNegativeTest-546957392</nova:project>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="serial">c5241646-e089-40a3-b197-60aff60ea075</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="uuid">c5241646-e089-40a3-b197-60aff60ea075</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/console.log" append="off"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:50 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:50 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:50 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:50 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.920 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.921 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.949 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.950 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.951 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Using config drive#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.960 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.961 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:50 np0005546909 nova_compute[187208]: 2025-12-05 12:00:50.961 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.018 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.019 187212 DEBUG nova.virt.disk.api [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.020 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.076 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.078 187212 DEBUG nova.virt.disk.api [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.078 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.079 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.079 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.080 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.080 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.082 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.087 187212 WARNING nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.091 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.092 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.host [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.095 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.096 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.097 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.098 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.virt.hardware [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
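The topology walk above (no flavor/image preference, per-dimension maximum 65536, exactly one possibility for 1 vCPU) can be illustrated with a simplified enumerator. This is a sketch of the idea, not `nova.virt.hardware` itself:

```python
# Simplified sketch of the topology search logged above: enumerate
# (sockets, cores, threads) triples whose product equals the vCPU count,
# subject to per-dimension maximums (65536 each by default, as in the log).
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    topos = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        rest = vcpus // sockets
        for cores in range(1, min(rest, max_cores) + 1):
            if rest % cores:
                continue
            threads = rest // cores
            if threads <= max_threads:
                topos.append((sockets, cores, threads))
    return topos

print(possible_topologies(1))  # [(1, 1, 1)] -- the single topology in the log
```

For 1 vCPU the only factorization is 1:1:1, which matches "Got 1 possible topologies" above; larger vCPU counts yield several candidates that Nova then sorts by preference.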
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.099 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.435 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Acquiring lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.436 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Acquired lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.436 187212 DEBUG nova.network.neutron [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.455 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <name>instance-00000012</name>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:51</nova:creationTime>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:        <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:51 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:51 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:51 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:51 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
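The domain XML dumped above can be inspected programmatically. A sketch using the stdlib parser on a trimmed copy of that XML (only a subset of elements is reproduced here):

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the domain XML logged above, for illustration.
DOMAIN_XML = """
<domain type="kvm">
  <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
  <name>instance-00000012</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <devices>
    <disk type="file" device="disk">
      <target dev="vda" bus="virtio"/>
    </disk>
    <disk type="file" device="cdrom">
      <target dev="sda" bus="sata"/>
    </disk>
  </devices>
</domain>
"""

root = ET.fromstring(DOMAIN_XML)
memory_kib = int(root.findtext("memory"))   # libvirt <memory> defaults to KiB
vcpus = int(root.findtext("vcpu"))
disk_targets = [t.get("dev") for t in root.iter("target")]
print(memory_kib, vcpus, disk_targets)      # 131072 1 ['vda', 'sda']
```

131072 KiB is 128 MiB, matching the m1.nano flavor's `memory_mb=128`; the two disk targets are the virtio root disk (`vda`) and the SATA config-drive CD-ROM (`sda`) from the `disk_info` mapping earlier in the log.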
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.732 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.733 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.734 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.741 187212 INFO nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Creating config drive at /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.747 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmxcsxbz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.766 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.852 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'keypairs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:00:51 np0005546909 nova_compute[187208]: 2025-12-05 12:00:51.871 187212 DEBUG oslo_concurrency.processutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfmxcsxbz" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
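The `mkisofs` run above builds the config drive as an ISO 9660 image with volume label `config-2`, the label cloud-init probes for. Note the multi-word `-publisher` value is a single argv element; `processutils` just logs it unquoted. A sketch of that invocation as an argv list (the paths passed below are placeholders, not the logged ones):

```python
# Sketch of the config-drive mkisofs invocation logged above, as an argv
# list; the multi-word publisher string is one argument.
def build_config_drive_cmd(iso_path, staging_dir, publisher):
    return [
        "/usr/bin/mkisofs", "-o", iso_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,
        "-quiet", "-J", "-r",
        "-V", "config-2",   # volume label that cloud-init searches for
        staging_dir,
    ]

cmd = build_config_drive_cmd(
    "/var/lib/nova/instances/example/disk.config",  # placeholder path
    "/tmp/metadata-staging",                        # placeholder staging dir
    "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9",
)
```

The staging directory (a `tempfile` directory like `/tmp/tmpfmxcsxbz` in the log) holds the metadata tree that gets burned into the ISO and is removed afterwards.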
Dec  5 07:00:51 np0005546909 systemd-machined[153543]: New machine qemu-20-instance-00000014.
Dec  5 07:00:51 np0005546909 systemd[1]: Started Virtual Machine qemu-20-instance-00000014.
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.424 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936052.4239159, c5241646-e089-40a3-b197-60aff60ea075 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.425 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Resumed (Lifecycle Event)
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.428 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.428 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.434 187212 INFO nova.virt.libvirt.driver [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance spawned successfully.
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.434 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.464 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.469 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.470 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.471 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.471 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.472 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.472 187212 DEBUG nova.virt.libvirt.driver [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.477 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.505 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.506 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936052.4248924, c5241646-e089-40a3-b197-60aff60ea075 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.507 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Started (Lifecycle Event)
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.556 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.604 187212 INFO nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 2.60 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.605 187212 DEBUG nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.780 187212 INFO nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.788 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5putlyy9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.814 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.829 187212 INFO nova.compute.manager [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 4.07 seconds to build instance.#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.919 187212 DEBUG oslo_concurrency.processutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5putlyy9" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:52 np0005546909 nova_compute[187208]: 2025-12-05 12:00:52.986 187212 DEBUG oslo_concurrency.lockutils [None req-16cbafaf-904d-4a20-b33a-fb626307c522 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:53 np0005546909 systemd-machined[153543]: New machine qemu-21-instance-00000012.
Dec  5 07:00:53 np0005546909 systemd[1]: Started Virtual Machine qemu-21-instance-00000012.
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.320 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 52d63666-4caa-4eaa-9128-6e21189b0932 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.322 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936053.3198972, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.322 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.324 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.325 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.327 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.328 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.579 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.591 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.594 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.595 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.596 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.596 187212 DEBUG nova.virt.libvirt.driver [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936053.3206246, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.663 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.685 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.688 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.886 187212 DEBUG nova.compute.manager [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:53 np0005546909 nova_compute[187208]: 2025-12-05 12:00:53.891 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.042 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.042 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.043 187212 DEBUG nova.objects.instance [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.195 187212 DEBUG oslo_concurrency.lockutils [None req-d324fed4-1db3-47d9-9411-bbb238b5cd7c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:54 np0005546909 podman[216075]: 2025-12-05 12:00:54.275639236 +0000 UTC m=+0.070275035 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:00:54 np0005546909 podman[216076]: 2025-12-05 12:00:54.306063933 +0000 UTC m=+0.097971994 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.786 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.786 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.812 187212 DEBUG nova.network.neutron [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.878 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936039.8766558, 597f2994-fdad-46b1-9ef7-f56d62b4bbd0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.879 187212 INFO nova.compute.manager [-] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:54 np0005546909 nova_compute[187208]: 2025-12-05 12:00:54.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.097 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.131 187212 DEBUG nova.compute.manager [None req-d47e53a1-6326-4c96-bfdf-45a8447c3ec7 - - - - - -] [instance: 597f2994-fdad-46b1-9ef7-f56d62b4bbd0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG oslo_concurrency.lockutils [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] Releasing lock "refresh_cache-d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG nova.compute.manager [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.136 187212 DEBUG nova.compute.manager [None req-c4298efe-1694-40bf-b2c5-b639c07610c9 db66f1e7359546c7a8219446661e92b3 bc9a7beb195a46e48037c8a50bb62e7a - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] network_info to inject: |[{"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.154 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "c5241646-e089-40a3-b197-60aff60ea075-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.155 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.156 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.157 187212 INFO nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Terminating instance#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.157 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.158 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquired lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.158 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.189 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.190 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.196 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.196 187212 INFO nova.compute.claims [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.522 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.701 187212 DEBUG nova.compute.provider_tree [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.718 187212 DEBUG nova.scheduler.client.report [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.747 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.748 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.796 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.809 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.824 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.901 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.902 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.903 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating image(s)#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.904 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.904 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.905 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:55 np0005546909 nova_compute[187208]: 2025-12-05 12:00:55.921 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.010 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.011 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.012 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.029 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.098 187212 DEBUG nova.network.neutron [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.102 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.103 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.128 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Releasing lock "refresh_cache-c5241646-e089-40a3-b197-60aff60ea075" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.129 187212 DEBUG nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.157 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.158 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.159 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec  5 07:00:56 np0005546909 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000014.scope: Consumed 4.176s CPU time.
Dec  5 07:00:56 np0005546909 systemd-machined[153543]: Machine qemu-20-instance-00000014 terminated.
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.248 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.250 187212 DEBUG nova.virt.disk.api [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Checking if we can resize image /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.251 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.311 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.313 187212 DEBUG nova.virt.disk.api [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Cannot resize image /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.313 187212 DEBUG nova.objects.instance [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'migration_context' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.327 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.327 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Ensure instance console log exists: /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.328 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.330 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.336 187212 WARNING nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.356 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.358 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.371 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.libvirt.host [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.373 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.374 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.374 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.375 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.376 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.377 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.377 187212 DEBUG nova.virt.hardware [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.382 187212 DEBUG nova.objects.instance [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'pci_devices' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.387 187212 INFO nova.virt.libvirt.driver [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance destroyed successfully.#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.387 187212 DEBUG nova.objects.instance [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lazy-loading 'resources' on Instance uuid c5241646-e089-40a3-b197-60aff60ea075 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.414 187212 INFO nova.virt.libvirt.driver [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deleting instance files /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075_del#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.415 187212 INFO nova.virt.libvirt.driver [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deletion of /var/lib/nova/instances/c5241646-e089-40a3-b197-60aff60ea075_del complete#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.420 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <uuid>897abc63-6217-4009-a547-8799c4621feb</uuid>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <name>instance-00000015</name>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiagnosticsV248Test-server-1605716829</nova:name>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:00:56</nova:creationTime>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:user uuid="3777f30c4e2e4644912c2ef76a3ea2c0">tempest-ServerDiagnosticsV248Test-494221988-project-member</nova:user>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:        <nova:project uuid="7a8c57ca06ea434e98ac6900d68e5c27">tempest-ServerDiagnosticsV248Test-494221988</nova:project>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="serial">897abc63-6217-4009-a547-8799c4621feb</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="uuid">897abc63-6217-4009-a547-8799c4621feb</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/console.log" append="off"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:00:56 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:00:56 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:00:56 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:00:56 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.475 187212 INFO nova.compute.manager [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.476 187212 DEBUG oslo.service.loopingcall [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.477 187212 DEBUG nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.477 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.486 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.486 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.487 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Using config drive#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.890 187212 INFO nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Creating config drive at /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.899 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52v8narm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.954 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.976 187212 DEBUG nova.network.neutron [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:00:56 np0005546909 nova_compute[187208]: 2025-12-05 12:00:56.992 187212 INFO nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] Took 0.51 seconds to deallocate network for instance.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.023 187212 DEBUG oslo_concurrency.processutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52v8narm" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.042 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.043 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:00:57 np0005546909 systemd-machined[153543]: New machine qemu-22-instance-00000015.
Dec  5 07:00:57 np0005546909 systemd[1]: Started Virtual Machine qemu-22-instance-00000015.
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.104 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936042.1026633, caa6c7c3-7eb3-4636-a7ad-7b605ef393ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.105 187212 INFO nova.compute.manager [-] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.106 187212 INFO nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Rebuilding instance#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.133 187212 DEBUG nova.compute.manager [None req-034d3c6e-a4fd-4486-9db6-ec82f682dea7 - - - - - -] [instance: caa6c7c3-7eb3-4636-a7ad-7b605ef393ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.199 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.283 187212 DEBUG nova.compute.provider_tree [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.297 187212 DEBUG nova.scheduler.client.report [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.326 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.346 187212 INFO nova.scheduler.client.report [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Deleted allocations for instance c5241646-e089-40a3-b197-60aff60ea075#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.366 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.393 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.430 187212 DEBUG oslo_concurrency.lockutils [None req-61e9611b-62ab-4a26-aa3c-9fd10ebd95df 32b963f457f74f00ad4c8ac7fa298e83 38d566f1d23b4fccb2a68f0a7aa78d72 - - default default] Lock "c5241646-e089-40a3-b197-60aff60ea075" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.448 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.462 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.474 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.484 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.496 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.502 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.531 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936057.5310698, 897abc63-6217-4009-a547-8799c4621feb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.531 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.534 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.535 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.538 187212 INFO nova.virt.libvirt.driver [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance spawned successfully.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.539 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.553 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.556 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.563 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.564 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.564 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.565 187212 DEBUG nova.virt.libvirt.driver [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936057.5338352, 897abc63-6217-4009-a547-8799c4621feb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.573 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Started (Lifecycle Event)#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.594 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.600 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.621 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.632 187212 INFO nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 1.73 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.632 187212 DEBUG nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.686 187212 INFO nova.compute.manager [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 2.52 seconds to build instance.#033[00m
Dec  5 07:00:57 np0005546909 nova_compute[187208]: 2025-12-05 12:00:57.701 187212 DEBUG oslo_concurrency.lockutils [None req-c110f300-7659-481f-b616-570892c9111e 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.915s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:00:59 np0005546909 podman[216172]: 2025-12-05 12:00:59.208095127 +0000 UTC m=+0.062777791 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 07:00:59 np0005546909 nova_compute[187208]: 2025-12-05 12:00:59.739 187212 DEBUG nova.compute.manager [None req-5db75d0d-2d1b-4d3f-b9ea-3a193a47a305 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:00:59 np0005546909 nova_compute[187208]: 2025-12-05 12:00:59.742 187212 INFO nova.compute.manager [None req-5db75d0d-2d1b-4d3f-b9ea-3a193a47a305 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Retrieving diagnostics#033[00m
Dec  5 07:00:59 np0005546909 nova_compute[187208]: 2025-12-05 12:00:59.966 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:00 np0005546909 nova_compute[187208]: 2025-12-05 12:01:00.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.125 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.126 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.127 187212 INFO nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Terminating instance#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.127 187212 DEBUG nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:01:01 np0005546909 kernel: tape56fa29b-45 (unregistering): left promiscuous mode
Dec  5 07:01:01 np0005546909 NetworkManager[55691]: <info>  [1764936061.1488] device (tape56fa29b-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:01Z|00098|binding|INFO|Releasing lport e56fa29b-453e-4140-997d-96c0de8ed4bb from this chassis (sb_readonly=0)
Dec  5 07:01:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:01Z|00099|binding|INFO|Setting lport e56fa29b-453e-4140-997d-96c0de8ed4bb down in Southbound
Dec  5 07:01:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:01Z|00100|binding|INFO|Removing iface tape56fa29b-45 ovn-installed in OVS
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.173 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:62:a3 10.100.0.3'], port_security=['fa:16:3e:37:62:a3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a211e57445104139baeb5ca8fa933c58', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b4dad292-a18a-4c80-b443-fe4ecc60c1b5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2eb759a3-016c-413a-81bd-572c3bccb661, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e56fa29b-453e-4140-997d-96c0de8ed4bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.175 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e56fa29b-453e-4140-997d-96c0de8ed4bb in datapath 16e72b69-f48e-48c4-b5b8-b2731e24f397 unbound from our chassis#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.176 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16e72b69-f48e-48c4-b5b8-b2731e24f397, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.178 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bead7b02-99c3-4c50-8a7c-8cde22c671ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.179 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 namespace which is not needed anymore#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.190 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000011.scope: Deactivated successfully.
Dec  5 07:01:01 np0005546909 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000011.scope: Consumed 13.056s CPU time.
Dec  5 07:01:01 np0005546909 systemd-machined[153543]: Machine qemu-18-instance-00000011 terminated.
Dec  5 07:01:01 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : haproxy version is 2.8.14-c23fe91
Dec  5 07:01:01 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [NOTICE]   (215613) : path to executable is /usr/sbin/haproxy
Dec  5 07:01:01 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [WARNING]  (215613) : Exiting Master process...
Dec  5 07:01:01 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [ALERT]    (215613) : Current worker (215615) exited with code 143 (Terminated)
Dec  5 07:01:01 np0005546909 neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397[215609]: [WARNING]  (215613) : All workers exited. Exiting... (0)
Dec  5 07:01:01 np0005546909 systemd[1]: libpod-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope: Deactivated successfully.
Dec  5 07:01:01 np0005546909 podman[216233]: 2025-12-05 12:01:01.323960147 +0000 UTC m=+0.046388894 container died 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:01:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8-userdata-shm.mount: Deactivated successfully.
Dec  5 07:01:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay-e5b1d7643321fc89fc61c887a583f48ca73e0a371a4b5fe52022732576250580-merged.mount: Deactivated successfully.
Dec  5 07:01:01 np0005546909 podman[216233]: 2025-12-05 12:01:01.420139731 +0000 UTC m=+0.142568478 container cleanup 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec  5 07:01:01 np0005546909 systemd[1]: libpod-conmon-7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8.scope: Deactivated successfully.
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.433 187212 INFO nova.virt.libvirt.driver [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Instance destroyed successfully.#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.434 187212 DEBUG nova.objects.instance [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lazy-loading 'resources' on Instance uuid bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.455 187212 DEBUG nova.virt.libvirt.vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2038537603',display_name='tempest-ServersTestJSON-server-2038537603',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-2038537603',id=17,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMtzxCx5WYGaho1uT1vhlFLxIivbdWss7SksmXXR8og/kbnuLPZgB17Trvp/z6Y5aD5/yAlqaXubyiqNS0bESVauUglSuMwk6CT9qVsDlZeY1DXt7lCJ98WxGxUuXIYIrA==',key_name='tempest-keypair-1060401215',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a211e57445104139baeb5ca8fa933c58',ramdisk_id='',reservation_id='r-059q99bz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-2138545093',owner_user_name='tempest-ServersTestJSON-2138545093-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4aa579f9c54f43039ef96c870ed5e049',uuid=bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.455 187212 DEBUG nova.network.os_vif_util [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converting VIF {"id": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "address": "fa:16:3e:37:62:a3", "network": {"id": "16e72b69-f48e-48c4-b5b8-b2731e24f397", "bridge": "br-int", "label": "tempest-ServersTestJSON-1662179596-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a211e57445104139baeb5ca8fa933c58", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape56fa29b-45", "ovs_interfaceid": "e56fa29b-453e-4140-997d-96c0de8ed4bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.456 187212 DEBUG nova.network.os_vif_util [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.456 187212 DEBUG os_vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape56fa29b-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.465 187212 INFO os_vif [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:62:a3,bridge_name='br-int',has_traffic_filtering=True,id=e56fa29b-453e-4140-997d-96c0de8ed4bb,network=Network(16e72b69-f48e-48c4-b5b8-b2731e24f397),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape56fa29b-45')#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.466 187212 INFO nova.virt.libvirt.driver [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deleting instance files /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d_del#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.466 187212 INFO nova.virt.libvirt.driver [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deletion of /var/lib/nova/instances/bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d_del complete#033[00m
Dec  5 07:01:01 np0005546909 podman[216276]: 2025-12-05 12:01:01.481962762 +0000 UTC m=+0.042340887 container remove 7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b462d86b-11d9-4ff0-ba68-512c44c360fe]: (4, ('Fri Dec  5 12:01:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 (7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8)\n7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8\nFri Dec  5 12:01:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 (7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8)\n7e9f57ccf7809873437b1ea91d214f9b2c2416c37655e3c95b6d2a377c994dc8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.491 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[450000b6-c77a-47e6-ab29-6f71b2bfc3b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.492 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16e72b69-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 kernel: tap16e72b69-f0: left promiscuous mode
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b968a65e-a087-4c5b-9294-376766b70d87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.522 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[07bac4f8-3eec-414e-aa18-3df6e6f82d30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.525 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[573d11c9-00b5-4614-baaa-8cc7a16be646]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.531 187212 INFO nova.compute.manager [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG oslo.service.loopingcall [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:01:01 np0005546909 nova_compute[187208]: 2025-12-05 12:01:01.532 187212 DEBUG nova.network.neutron [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a02adf4-5bdf-4022-a6c8-3cee1c455ce1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341004, 'reachable_time': 27199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216290, 'error': None, 'target': 'ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.551 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16e72b69-f48e-48c4-b5b8-b2731e24f397 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:01:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:01.551 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7614eaeb-51a2-42ea-a94d-7055619c5978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:01 np0005546909 systemd[1]: run-netns-ovnmeta\x2d16e72b69\x2df48e\x2d48c4\x2db5b8\x2db2731e24f397.mount: Deactivated successfully.
Dec  5 07:01:02 np0005546909 nova_compute[187208]: 2025-12-05 12:01:02.297 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:02Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:e2:12 10.100.0.10
Dec  5 07:01:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:02Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:e2:12 10.100.0.10
Dec  5 07:01:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.008 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.008 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:03 np0005546909 nova_compute[187208]: 2025-12-05 12:01:03.443 187212 INFO nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Rebuilding instance#033[00m
Dec  5 07:01:03 np0005546909 nova_compute[187208]: 2025-12-05 12:01:03.763 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:03 np0005546909 nova_compute[187208]: 2025-12-05 12:01:03.856 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.051 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_requests' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.067 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.081 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.100 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.114 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.120 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.170 187212 DEBUG nova.network.neutron [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.201 187212 INFO nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Took 2.67 seconds to deallocate network for instance.#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.281 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.282 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.478 187212 DEBUG nova.compute.provider_tree [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.492 187212 DEBUG nova.scheduler.client.report [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.520 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.545 187212 INFO nova.scheduler.client.report [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Deleted allocations for instance bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.622 187212 DEBUG oslo_concurrency.lockutils [None req-b0d9e730-bbc3-4e67-b390-44d2da9800a7 4aa579f9c54f43039ef96c870ed5e049 a211e57445104139baeb5ca8fa933c58 - - default default] Lock "bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:04 np0005546909 nova_compute[187208]: 2025-12-05 12:01:04.653 187212 DEBUG nova.compute.manager [req-fb136fdd-ec7a-43f9-907e-0b87babf8067 req-b80dbebe-e076-4d08-b51e-6b9db3c7f726 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Received event network-vif-deleted-e56fa29b-453e-4140-997d-96c0de8ed4bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:06 np0005546909 podman[216346]: 2025-12-05 12:01:06.238911398 +0000 UTC m=+0.087314430 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:01:06 np0005546909 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec  5 07:01:06 np0005546909 NetworkManager[55691]: <info>  [1764936066.3087] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:01:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:06Z|00101|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec  5 07:01:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:06Z|00102|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec  5 07:01:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:06Z|00103|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.328 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.330 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.333 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.352 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94e0bff3-72ea-41ff-8b0d-2b6c0fd2e0fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  5 07:01:06 np0005546909 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 15.656s CPU time.
Dec  5 07:01:06 np0005546909 systemd-machined[153543]: Machine qemu-12-instance-0000000c terminated.
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.383 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[de5058d8-f1d0-49b5-ad1d-4ac753379db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.387 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[54aaef84-d6f5-4187-add2-e267028817e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.409 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a14ea123-b513-44e1-9b35-d8fd009fe450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.427 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[25d2f3b1-c4c0-45ab-b31c-76b0fa00e5e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216380, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.444 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7165c9-acbf-433f-a4dd-1e1ded287666]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216381, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216381, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.447 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.455 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:06.458 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:06 np0005546909 nova_compute[187208]: 2025-12-05 12:01:06.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.145 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance shutdown successfully after 3 seconds.#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.155 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.162 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.164 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,ta
sk_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:01Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.165 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.166 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.168 187212 DEBUG os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.172 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.183 187212 INFO os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.184 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.186 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.223 187212 DEBUG nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.224 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.224 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 DEBUG oslo_concurrency.lockutils [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 DEBUG nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.225 187212 WARNING nova.compute.manager [req-b037891e-25a7-4769-95b6-a4ad8322ae62 req-37a9d931-e9e6-4c4a-8354-65fdc91cdaba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state error and task_state rebuilding.#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.412 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.413 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.414 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.415 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.416 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.439 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.494 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.496 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.497 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.513 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.547 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.567 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.568 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.601 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.603 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.603 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.669 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.671 187212 DEBUG nova.virt.disk.api [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.671 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.715 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.716 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.737 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.743 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.744 187212 DEBUG nova.virt.disk.api [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.745 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.745 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.746 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.749 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.753 187212 WARNING nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.757 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.758 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.761 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.761 187212 DEBUG nova.virt.libvirt.host [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.762 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.763 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.764 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.virt.hardware [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.765 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.795 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminT
estJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:07Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.796 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.796 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.798 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <name>instance-0000000c</name>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:07</nova:creationTime>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:24:4f:38"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <target dev="tap380c99a7-94"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:07 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:07 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:07 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:07 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.803 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.804 187212 DEBUG nova.virt.libvirt.vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminT
estJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:07Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.805 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.805 187212 DEBUG nova.network.os_vif_util [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.806 187212 DEBUG os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.807 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.808 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.814 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.815 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:07 np0005546909 NetworkManager[55691]: <info>  [1764936067.8199] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.825 187212 INFO os_vif [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.842 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.842 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.848 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.848 187212 INFO nova.compute.claims [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.916 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.916 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.917 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.917 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.938 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:07 np0005546909 nova_compute[187208]: 2025-12-05 12:01:07.964 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'keypairs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.097 187212 DEBUG nova.compute.provider_tree [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.112 187212 DEBUG nova.scheduler.client.report [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.133 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.133 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.194 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.195 187212 DEBUG nova.network.neutron [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.214 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.233 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.334 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.337 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.337 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating image(s)#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.338 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.338 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.339 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.351 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.425 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.426 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.427 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.444 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.502 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.503 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.623 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk 1073741824" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.625 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.625 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.681 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.682 187212 DEBUG nova.virt.disk.api [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Checking if we can resize image /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.683 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.738 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.739 187212 DEBUG nova.virt.disk.api [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Cannot resize image /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.739 187212 DEBUG nova.objects.instance [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'migration_context' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.756 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.757 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Ensure instance console log exists: /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.757 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.758 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.758 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.784 187212 INFO nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.790 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0_8e_8p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.874 187212 DEBUG nova.network.neutron [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.875 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.876 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.880 187212 WARNING nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.884 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.884 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.887 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.887 187212 DEBUG nova.virt.libvirt.host [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.888 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.889 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.890 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.891 187212 DEBUG nova.virt.hardware [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.895 187212 DEBUG nova.objects.instance [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.913 187212 DEBUG oslo_concurrency.processutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc0_8e_8p" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.918 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <uuid>ab127619-9b81-4800-a347-5747dd062e5e</uuid>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <name>instance-00000016</name>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:name>tempest-TenantUsagesTestJSON-server-1464755080</nova:name>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:08</nova:creationTime>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:user uuid="4f0702685eba47f39b88602e4d1f00cc">tempest-TenantUsagesTestJSON-1778758271-project-member</nova:user>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:        <nova:project uuid="02988f772510450db9a7b8c5bd4b0dc7">tempest-TenantUsagesTestJSON-1778758271</nova:project>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="serial">ab127619-9b81-4800-a347-5747dd062e5e</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="uuid">ab127619-9b81-4800-a347-5747dd062e5e</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/console.log" append="off"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:08 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:08 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:08 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:08 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.991 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.992 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:08 np0005546909 nova_compute[187208]: 2025-12-05 12:01:08.992 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Using config drive#033[00m
Dec  5 07:01:08 np0005546909 kernel: tap380c99a7-94: entered promiscuous mode
Dec  5 07:01:08 np0005546909 systemd-udevd[216373]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:08 np0005546909 NetworkManager[55691]: <info>  [1764936068.9994] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec  5 07:01:09 np0005546909 NetworkManager[55691]: <info>  [1764936069.0457] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:09 np0005546909 NetworkManager[55691]: <info>  [1764936069.0467] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:09Z|00104|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec  5 07:01:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:09Z|00105|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.048 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.057 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.057 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.059 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:01:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:09Z|00106|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec  5 07:01:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:09Z|00107|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.076 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d6adbce-de82-4639-a534-48f96f8a7174]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 systemd-machined[153543]: New machine qemu-23-instance-0000000c.
Dec  5 07:01:09 np0005546909 systemd[1]: Started Virtual Machine qemu-23-instance-0000000c.
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.108 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a499ca-d8fe-449f-be28-617e9ac36ae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.111 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6491c5c4-8708-4ef2-b5a6-615c26953b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.134 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8737174-48f8-42a7-8280-8792e05caf29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.162 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[40b3a960-9c87-4d74-8828-28ce10ca33da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 13, 'rx_bytes': 868, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 13, 'rx_bytes': 868, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216480, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72426923-11e8-4384-a2c8-8b5f31c6f734]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216481, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216481, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.177 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:09.180 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG nova.compute.manager [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.352 187212 DEBUG oslo_concurrency.lockutils [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.353 187212 DEBUG nova.compute.manager [req-6125d50a-861d-4fe4-8292-2c3c40ee1cbc req-f2632571-c572-44db-92d1-7d7d1d1e49e4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:01:09 np0005546909 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec  5 07:01:09 np0005546909 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000012.scope: Consumed 12.409s CPU time.
Dec  5 07:01:09 np0005546909 systemd-machined[153543]: Machine qemu-21-instance-00000012 terminated.
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7814522, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.782 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.784 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.787 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.792 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.792 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.800 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.803 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.809 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.809 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.810 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.811 187212 DEBUG nova.virt.libvirt.driver [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.830 187212 INFO nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Creating config drive at /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.835 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpme1xpuld execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.853 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.854 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7824657, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.854 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.876 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.880 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936069.7875993, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.880 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.889 187212 DEBUG nova.compute.manager [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.898 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.900 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.926 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.929 187212 DEBUG nova.compute.manager [None req-34cd94ef-e927-48aa-b044-25943cc30bf6 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.942 187212 INFO nova.compute.manager [None req-34cd94ef-e927-48aa-b044-25943cc30bf6 a19dd465f6924cd8a015d5ec028d2e21 4ee519a917b34232836644d7ed32c09c - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Retrieving diagnostics#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.952 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.953 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.953 187212 DEBUG nova.objects.instance [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:01:09 np0005546909 nova_compute[187208]: 2025-12-05 12:01:09.960 187212 DEBUG oslo_concurrency.processutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpme1xpuld" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:10 np0005546909 systemd-machined[153543]: New machine qemu-24-instance-00000016.
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.028 187212 DEBUG oslo_concurrency.lockutils [None req-f51b0bae-c513-4b99-8eab-c2dd3e413cd7 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:10 np0005546909 systemd[1]: Started Virtual Machine qemu-24-instance-00000016.
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.173 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "897abc63-6217-4009-a547-8799c4621feb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.174 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.176 187212 INFO nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Terminating instance#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquired lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.177 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.368 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.563 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.570 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.576 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.577 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.577 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.735 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.736 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating image(s)#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.736 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.737 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.737 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.753 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.827 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.830 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.831 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.854 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.877 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936070.8656435, ab127619-9b81-4800-a347-5747dd062e5e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.878 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.884 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.885 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.888 187212 DEBUG nova.network.neutron [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.895 187212 INFO nova.virt.libvirt.driver [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance spawned successfully.#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.896 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.911 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.914 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Releasing lock "refresh_cache-897abc63-6217-4009-a547-8799c4621feb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.914 187212 DEBUG nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.919 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.921 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.943 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.954 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.955 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.956 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.957 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.958 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.959 187212 DEBUG nova.virt.libvirt.driver [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.964 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.966 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.967 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:10 np0005546909 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000015.scope: Deactivated successfully.
Dec  5 07:01:10 np0005546909 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000015.scope: Consumed 12.136s CPU time.
Dec  5 07:01:10 np0005546909 systemd-machined[153543]: Machine qemu-22-instance-00000015 terminated.
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.996 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.997 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936070.8686862, ab127619-9b81-4800-a347-5747dd062e5e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:10 np0005546909 nova_compute[187208]: 2025-12-05 12:01:10.997 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.021 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG nova.virt.disk.api [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Checking if we can resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.023 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.048 187212 INFO nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 2.71 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.049 187212 DEBUG nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.055 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.083 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.085 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.085 187212 DEBUG nova.virt.disk.api [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Cannot resize image /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.086 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.086 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Ensure instance console log exists: /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.087 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.087 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.088 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.089 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.094 187212 WARNING nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.099 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.099 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.102 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.libvirt.host [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.103 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.104 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.105 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.virt.hardware [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.106 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.118 187212 INFO nova.compute.manager [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 3.31 seconds to build instance.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.123 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <uuid>52d63666-4caa-4eaa-9128-6e21189b0932</uuid>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <name>instance-00000012</name>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdmin275Test-server-1823558123</nova:name>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:11</nova:creationTime>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:user uuid="3a90749503e34bda87974b2c22626de0">tempest-ServersAdmin275Test-1624449796-project-member</nova:user>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:        <nova:project uuid="6d28e47b844b47238fb8386dae6c546e">tempest-ServersAdmin275Test-1624449796</nova:project>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="serial">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="uuid">52d63666-4caa-4eaa-9128-6e21189b0932</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/console.log" append="off"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:11 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:11 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:11 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:11 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.139 187212 DEBUG oslo_concurrency.lockutils [None req-93cdf8a4-7ee9-4125-9fc4-b0f47ab52b1e 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.198 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.198 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.199 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Using config drive#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.204 187212 INFO nova.virt.libvirt.driver [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance destroyed successfully.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.204 187212 DEBUG nova.objects.instance [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lazy-loading 'resources' on Instance uuid 897abc63-6217-4009-a547-8799c4621feb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.232 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.236 187212 INFO nova.virt.libvirt.driver [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deleting instance files /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb_del#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.237 187212 INFO nova.virt.libvirt.driver [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deletion of /var/lib/nova/instances/897abc63-6217-4009-a547-8799c4621feb_del complete#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.263 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lazy-loading 'keypairs' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.315 187212 INFO nova.compute.manager [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.315 187212 DEBUG oslo.service.loopingcall [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.316 187212 DEBUG nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.316 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.384 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936056.380033, c5241646-e089-40a3-b197-60aff60ea075 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.385 187212 INFO nova.compute.manager [-] [instance: c5241646-e089-40a3-b197-60aff60ea075] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.437 187212 DEBUG nova.compute.manager [None req-eb546a3e-1030-4128-a0a9-8ba433345284 - - - - - -] [instance: c5241646-e089-40a3-b197-60aff60ea075] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.453 187212 INFO nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Creating config drive at /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.458 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7abbxh_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.596 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.597 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.599 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.599 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.600 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.600 187212 WARNING nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.601 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.601 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG oslo_concurrency.lockutils [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.602 187212 DEBUG nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.603 187212 WARNING nova.compute.manager [req-d17b8371-4122-4408-91ca-e1f835c22da2 req-f2b7853b-a2d9-44f6-aeed-882d3f9feb56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.604 187212 DEBUG oslo_concurrency.processutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx7abbxh_" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:11 np0005546909 systemd-machined[153543]: New machine qemu-25-instance-00000012.
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.689 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.701 187212 DEBUG nova.network.neutron [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:11 np0005546909 systemd[1]: Started Virtual Machine qemu-25-instance-00000012.
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.714 187212 INFO nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] Took 0.40 seconds to deallocate network for instance.#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.766 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.766 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.953 187212 DEBUG nova.compute.provider_tree [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.967 187212 DEBUG nova.scheduler.client.report [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:11 np0005546909 nova_compute[187208]: 2025-12-05 12:01:11.986 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.029 187212 INFO nova.scheduler.client.report [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Deleted allocations for instance 897abc63-6217-4009-a547-8799c4621feb#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.103 187212 DEBUG oslo_concurrency.lockutils [None req-795cdb2e-6302-4907-ad1c-bb91c3614ed6 3777f30c4e2e4644912c2ef76a3ea2c0 7a8c57ca06ea434e98ac6900d68e5c27 - - default default] Lock "897abc63-6217-4009-a547-8799c4621feb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:12Z|00108|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.183 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:12Z|00109|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.388 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 52d63666-4caa-4eaa-9128-6e21189b0932 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.389 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936072.3881116, 52d63666-4caa-4eaa-9128-6e21189b0932 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.389 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.392 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.392 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.397 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance spawned successfully.#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.397 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.608 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.612 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.612 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.613 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.614 187212 DEBUG nova.virt.libvirt.driver [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.620 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936072.3907473, 52d63666-4caa-4eaa-9128-6e21189b0932 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.723 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.726 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.732 187212 DEBUG nova.compute.manager [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.783 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.784 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.784 187212 DEBUG nova.objects.instance [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:12 np0005546909 nova_compute[187208]: 2025-12-05 12:01:12.834 187212 DEBUG oslo_concurrency.lockutils [None req-8efcdc52-1d25-4e91-8649-c0ee34f0065a 1e5566fbd86d453ba09f39076d9d7ce1 b88cf9ba1eea432685589a63a80e95a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.009 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "ab127619-9b81-4800-a347-5747dd062e5e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.010 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.011 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.012 187212 INFO nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Terminating instance#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquired lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.013 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:13 np0005546909 nova_compute[187208]: 2025-12-05 12:01:13.213 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.020 187212 DEBUG nova.network.neutron [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.049 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Releasing lock "refresh_cache-ab127619-9b81-4800-a347-5747dd062e5e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.049 187212 DEBUG nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:01:14 np0005546909 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec  5 07:01:14 np0005546909 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000016.scope: Consumed 3.982s CPU time.
Dec  5 07:01:14 np0005546909 systemd-machined[153543]: Machine qemu-24-instance-00000016 terminated.
Dec  5 07:01:14 np0005546909 podman[216578]: 2025-12-05 12:01:14.161642457 +0000 UTC m=+0.072263006 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.304 187212 INFO nova.virt.libvirt.driver [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance destroyed successfully.#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.305 187212 DEBUG nova.objects.instance [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lazy-loading 'resources' on Instance uuid ab127619-9b81-4800-a347-5747dd062e5e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.324 187212 INFO nova.virt.libvirt.driver [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deleting instance files /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e_del#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.324 187212 INFO nova.virt.libvirt.driver [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deletion of /var/lib/nova/instances/ab127619-9b81-4800-a347-5747dd062e5e_del complete#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.624 187212 INFO nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Terminating instance#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.625 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.626 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquired lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.627 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.633 187212 INFO nova.compute.manager [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.635 187212 DEBUG oslo.service.loopingcall [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.635 187212 DEBUG nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.636 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.684 187212 INFO nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Rebuilding instance#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.847 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.863 187212 DEBUG nova.network.neutron [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.871 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.881 187212 INFO nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Took 0.25 seconds to deallocate network for instance.#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.920 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.920 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.929 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:14 np0005546909 nova_compute[187208]: 2025-12-05 12:01:14.956 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.002 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_requests' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.019 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.040 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.061 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'migration_context' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.098 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.102 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.127 187212 DEBUG nova.compute.provider_tree [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.155 187212 DEBUG nova.scheduler.client.report [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.180 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.207 187212 INFO nova.scheduler.client.report [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Deleted allocations for instance ab127619-9b81-4800-a347-5747dd062e5e#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.311 187212 DEBUG oslo_concurrency.lockutils [None req-37c2dea6-a61a-4874-84d9-d5c4b907a2d1 4f0702685eba47f39b88602e4d1f00cc 02988f772510450db9a7b8c5bd4b0dc7 - - default default] Lock "ab127619-9b81-4800-a347-5747dd062e5e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.544 187212 DEBUG nova.network.neutron [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.565 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Releasing lock "refresh_cache-52d63666-4caa-4eaa-9128-6e21189b0932" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.565 187212 DEBUG nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:01:15 np0005546909 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec  5 07:01:15 np0005546909 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000012.scope: Consumed 3.795s CPU time.
Dec  5 07:01:15 np0005546909 systemd-machined[153543]: Machine qemu-25-instance-00000012 terminated.
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.816 187212 INFO nova.virt.libvirt.driver [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance destroyed successfully.#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.817 187212 DEBUG nova.objects.instance [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lazy-loading 'resources' on Instance uuid 52d63666-4caa-4eaa-9128-6e21189b0932 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.833 187212 INFO nova.virt.libvirt.driver [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deleting instance files /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.834 187212 INFO nova.virt.libvirt.driver [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deletion of /var/lib/nova/instances/52d63666-4caa-4eaa-9128-6e21189b0932_del complete#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.894 187212 INFO nova.compute.manager [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG oslo.service.loopingcall [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:01:15 np0005546909 nova_compute[187208]: 2025-12-05 12:01:15.895 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.182 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.197 187212 DEBUG nova.network.neutron [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.219 187212 INFO nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Took 0.32 seconds to deallocate network for instance.#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.265 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.266 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.423 187212 DEBUG nova.compute.provider_tree [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.430 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936061.429443, bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.430 187212 INFO nova.compute.manager [-] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.442 187212 DEBUG nova.scheduler.client.report [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.452 187212 DEBUG nova.compute.manager [None req-a9e6b9b7-cc2a-4a0c-a21b-2907dda973f3 - - - - - -] [instance: bcfc1cd6-e0b9-4772-9ce5-80cd8563e14d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.468 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.498 187212 INFO nova.scheduler.client.report [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Deleted allocations for instance 52d63666-4caa-4eaa-9128-6e21189b0932#033[00m
Dec  5 07:01:16 np0005546909 nova_compute[187208]: 2025-12-05 12:01:16.566 187212 DEBUG oslo_concurrency.lockutils [None req-f8e6ad91-5ddf-448d-918c-389054da9d4c 3a90749503e34bda87974b2c22626de0 6d28e47b844b47238fb8386dae6c546e - - default default] Lock "52d63666-4caa-4eaa-9128-6e21189b0932" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:17 np0005546909 nova_compute[187208]: 2025-12-05 12:01:17.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:17 np0005546909 nova_compute[187208]: 2025-12-05 12:01:17.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:19 np0005546909 podman[216615]: 2025-12-05 12:01:19.205063033 +0000 UTC m=+0.054366355 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:01:20 np0005546909 podman[216634]: 2025-12-05 12:01:20.214402537 +0000 UTC m=+0.059991325 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Dec  5 07:01:22 np0005546909 nova_compute[187208]: 2025-12-05 12:01:22.367 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:22Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:01:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:22Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:01:22 np0005546909 nova_compute[187208]: 2025-12-05 12:01:22.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.772 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.772 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.802 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.879 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.880 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.885 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:24 np0005546909 nova_compute[187208]: 2025-12-05 12:01:24.886 187212 INFO nova.compute.claims [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.044 187212 DEBUG nova.compute.provider_tree [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.065 187212 DEBUG nova.scheduler.client.report [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.097 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.098 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.146 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.163 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.164 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.180 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.197 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:01:25 np0005546909 podman[216666]: 2025-12-05 12:01:25.209875534 +0000 UTC m=+0.055912099 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:01:25 np0005546909 podman[216667]: 2025-12-05 12:01:25.242387722 +0000 UTC m=+0.086865162 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.293 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.294 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.295 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating image(s)#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.295 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.296 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.296 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.313 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.368 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.370 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.370 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.386 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.442 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.443 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.479 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.480 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.480 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.543 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.544 187212 DEBUG nova.virt.disk.api [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Checking if we can resize image /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.545 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.601 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.602 187212 DEBUG nova.virt.disk.api [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Cannot resize image /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.602 187212 DEBUG nova.objects.instance [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'migration_context' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.619 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Ensure instance console log exists: /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.620 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:25 np0005546909 nova_compute[187208]: 2025-12-05 12:01:25.621 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:26 np0005546909 nova_compute[187208]: 2025-12-05 12:01:26.202 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936071.200889, 897abc63-6217-4009-a547-8799c4621feb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:26 np0005546909 nova_compute[187208]: 2025-12-05 12:01:26.202 187212 INFO nova.compute.manager [-] [instance: 897abc63-6217-4009-a547-8799c4621feb] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:01:26 np0005546909 nova_compute[187208]: 2025-12-05 12:01:26.226 187212 DEBUG nova.compute.manager [None req-2efd0d07-15d0-491a-ae36-de792470b97c - - - - - -] [instance: 897abc63-6217-4009-a547-8799c4621feb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:26 np0005546909 nova_compute[187208]: 2025-12-05 12:01:26.567 187212 DEBUG nova.policy [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '496da6872d53413ea1c201178cf5b05c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8400e354e93c4b33b8d683012dfe5c94', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.022 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.023 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.042 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.120 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.120 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.125 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.126 187212 INFO nova.compute.claims [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.318 187212 DEBUG nova.compute.provider_tree [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:27 np0005546909 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec  5 07:01:27 np0005546909 NetworkManager[55691]: <info>  [1764936087.3256] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.332 187212 DEBUG nova.scheduler.client.report [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:27Z|00110|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec  5 07:01:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:27Z|00111|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec  5 07:01:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:27Z|00112|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.342 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.343 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.345 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.358 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.358 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06c5745d-da48-419b-ab0a-3f434b3961c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.369 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.387 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dd709adb-d972-4835-9ee9-de601c64aeb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.390 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7910f3eb-70e1-4442-b1e0-91816445abff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  5 07:01:27 np0005546909 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000000c.scope: Consumed 13.278s CPU time.
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.403 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.403 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:27 np0005546909 systemd-machined[153543]: Machine qemu-23-instance-0000000c terminated.
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.418 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0eeac5-8a07-4e67-b517-b15d415b471d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.425 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dae07921-0ac0-42c4-9adb-ada934a19cb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 15, 'rx_bytes': 868, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 15, 'rx_bytes': 868, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216744, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.442 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[477d79e5-0f0e-4f95-94a4-713f49e7be03]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216745, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216745, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.450 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.451 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.457 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:27.458 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.540 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.542 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.542 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating image(s)#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.543 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.543 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.544 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.561 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:27 np0005546909 kernel: tap380c99a7-94: entered promiscuous mode
Dec  5 07:01:27 np0005546909 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec  5 07:01:27 np0005546909 NetworkManager[55691]: <info>  [1764936087.5783] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.581 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.637 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.638 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.639 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.658 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.679 187212 DEBUG nova.policy [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a5b1ecad65045afbe3c154494417765', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c184f0f2b71412fb560981314d0574d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.712 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.712 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.744 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.745 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.746 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.809 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.810 187212 DEBUG nova.virt.disk.api [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Checking if we can resize image /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.811 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.864 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.882 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.883 187212 DEBUG nova.virt.disk.api [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Cannot resize image /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:27 np0005546909 nova_compute[187208]: 2025-12-05 12:01:27.883 187212 DEBUG nova.objects.instance [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'migration_context' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.044 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.044 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Ensure instance console log exists: /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.045 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.159 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.165 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.173 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.174 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:13Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.175 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.176 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.176 187212 DEBUG os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.180 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.185 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.188 187212 INFO os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.188 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del#033[00m
Dec  5 07:01:28 np0005546909 nova_compute[187208]: 2025-12-05 12:01:28.189 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.303 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936074.3023188, ab127619-9b81-4800-a347-5747dd062e5e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.303 187212 INFO nova.compute.manager [-] [instance: ab127619-9b81-4800-a347-5747dd062e5e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.402 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.403 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating image(s)#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.404 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.418 187212 DEBUG nova.compute.manager [None req-e990e7c0-5100-47e3-bc3d-2faddc29fc81 - - - - - -] [instance: ab127619-9b81-4800-a347-5747dd062e5e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.420 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.484 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.485 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.486 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.502 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.559 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.560 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.595 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.596 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.596 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.651 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.652 187212 DEBUG nova.virt.disk.api [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Checking if we can resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.653 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.707 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.708 187212 DEBUG nova.virt.disk.api [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Cannot resize image /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Ensure instance console log exists: /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.709 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.710 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.710 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.712 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start _get_guest_xml network_info=[{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.717 187212 WARNING nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.729 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.730 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.732 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.733 187212 DEBUG nova.virt.libvirt.host [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.733 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.734 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.734 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.735 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.736 187212 DEBUG nova.virt.hardware [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.737 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.756 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminT
estJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:28Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.756 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.757 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.759 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <uuid>982a8e69-5181-4847-bdfe-8d4de12bb2e4</uuid>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <name>instance-0000000c</name>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAdminTestJSON-server-1785289561</nova:name>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:29</nova:creationTime>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:user uuid="1ac3c267120a4aeaa91f472943c4e1e2">tempest-ServersAdminTestJSON-715947304-project-member</nova:user>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:project uuid="98815fe6b9ea4988abc2cccd9726dc86">tempest-ServersAdminTestJSON-715947304</nova:project>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        <nova:port uuid="380c99a7-9480-45f8-b2f4-adfcdfa8576d">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="serial">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="uuid">982a8e69-5181-4847-bdfe-8d4de12bb2e4</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:24:4f:38"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <target dev="tap380c99a7-94"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/console.log" append="off"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:29 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:29 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:29 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:29 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.760 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Preparing to wait for external event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.760 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.761 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.761 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.virt.libvirt.vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminT
estJSON-715947304-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:28Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.762 187212 DEBUG nova.network.os_vif_util [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.763 187212 DEBUG os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.766 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.766 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap380c99a7-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap380c99a7-94, col_values=(('external_ids', {'iface-id': '380c99a7-9480-45f8-b2f4-adfcdfa8576d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4f:38', 'vm-uuid': '982a8e69-5181-4847-bdfe-8d4de12bb2e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:29 np0005546909 NetworkManager[55691]: <info>  [1764936089.7695] manager: (tap380c99a7-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:29 np0005546909 nova_compute[187208]: 2025-12-05 12:01:29.774 187212 INFO os_vif [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:01:29 np0005546909 podman[216791]: 2025-12-05 12:01:29.863814973 +0000 UTC m=+0.057130853 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.658 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] No VIF found with MAC fa:16:3e:24:4f:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.659 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Using config drive#033[00m
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.815 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936075.814507, 52d63666-4caa-4eaa-9128-6e21189b0932 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:30 np0005546909 nova_compute[187208]: 2025-12-05 12:01:30.816 187212 INFO nova.compute.manager [-] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:01:31 np0005546909 nova_compute[187208]: 2025-12-05 12:01:31.284 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:31 np0005546909 nova_compute[187208]: 2025-12-05 12:01:31.287 187212 DEBUG nova.compute.manager [None req-6752171b-cb46-489d-9e0f-4c38d3f8bd91 - - - - - -] [instance: 52d63666-4caa-4eaa-9128-6e21189b0932] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:31 np0005546909 nova_compute[187208]: 2025-12-05 12:01:31.314 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'keypairs' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:31 np0005546909 nova_compute[187208]: 2025-12-05 12:01:31.415 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Successfully created port: 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:01:31 np0005546909 nova_compute[187208]: 2025-12-05 12:01:31.566 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully created port: 02d6eab5-4561-4d9f-ad9a-169b57667224 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.372 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.491 187212 INFO nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Creating config drive at /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.496 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd15zg5t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.617 187212 DEBUG oslo_concurrency.processutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcd15zg5t" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:32 np0005546909 kernel: tap380c99a7-94: entered promiscuous mode
Dec  5 07:01:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:32Z|00113|binding|INFO|Claiming lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d for this chassis.
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:32Z|00114|binding|INFO|380c99a7-9480-45f8-b2f4-adfcdfa8576d: Claiming fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:01:32 np0005546909 NetworkManager[55691]: <info>  [1764936092.6731] manager: (tap380c99a7-94): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.680 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.681 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.683 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:01:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:32Z|00115|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d ovn-installed in OVS
Dec  5 07:01:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:32Z|00116|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d up in Southbound
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.698 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aface6b5-80ed-4c2b-8758-35fbf3f85d73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 systemd-udevd[216827]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:32 np0005546909 systemd-machined[153543]: New machine qemu-26-instance-0000000c.
Dec  5 07:01:32 np0005546909 NetworkManager[55691]: <info>  [1764936092.7200] device (tap380c99a7-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:32 np0005546909 NetworkManager[55691]: <info>  [1764936092.7207] device (tap380c99a7-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:32 np0005546909 systemd[1]: Started Virtual Machine qemu-26-instance-0000000c.
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.727 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfb9359-d34b-48c9-bf40-223eba1cd30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.731 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0e21bbb6-421c-4b46-afe6-07218da108e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.758 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[58c02809-d387-42bf-a269-5aff061144a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.775 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[253b4214-ec0b-4faa-a6b3-3adf8cdb1753]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216840, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.795 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d9f81c-b0bb-47d8-85d7-7414802dc22e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216842, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216842, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.797 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.803 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.804 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:32.804 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.970 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully updated port: 02d6eab5-4561-4d9f-ad9a-169b57667224 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.989 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.989 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:32 np0005546909 nova_compute[187208]: 2025-12-05 12:01:32.990 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.179 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.264 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 982a8e69-5181-4847-bdfe-8d4de12bb2e4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.264 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936093.2632902, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.265 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.288 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.291 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936093.2644527, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.291 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.441 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.447 187212 DEBUG nova.compute.manager [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-changed-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.448 187212 DEBUG nova.compute.manager [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing instance network info cache due to event network-changed-02d6eab5-4561-4d9f-ad9a-169b57667224. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.448 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.453 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:33 np0005546909 nova_compute[187208]: 2025-12-05 12:01:33.482 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.196 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.196 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.218 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.354 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.355 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.371 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.372 187212 INFO nova.compute.claims [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.573 187212 DEBUG nova.network.neutron [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.605 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.605 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance network_info: |[{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.606 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.606 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing network info cache for port 02d6eab5-4561-4d9f-ad9a-169b57667224 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.609 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start _get_guest_xml network_info=[{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.612 187212 WARNING nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.615 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.616 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.619 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.619 187212 DEBUG nova.virt.libvirt.host [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.620 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.621 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.622 187212 DEBUG nova.virt.hardware [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.624 187212 DEBUG nova.virt.libvirt.vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:27Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.625 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.625 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.626 187212 DEBUG nova.objects.instance [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.638 187212 DEBUG nova.compute.provider_tree [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.641 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <uuid>7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</uuid>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <name>instance-00000018</name>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachInterfacesV270Test-server-1046212835</nova:name>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:34</nova:creationTime>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:user uuid="9a5b1ecad65045afbe3c154494417765">tempest-AttachInterfacesV270Test-1975383464-project-member</nova:user>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:project uuid="2c184f0f2b71412fb560981314d0574d">tempest-AttachInterfacesV270Test-1975383464</nova:project>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        <nova:port uuid="02d6eab5-4561-4d9f-ad9a-169b57667224">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="serial">7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="uuid">7b8cf31f-430b-4c7f-9c33-7d0cadd44d31</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:d4:b7:ec"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <target dev="tap02d6eab5-45"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/console.log" append="off"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:34 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:34 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:34 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:34 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Preparing to wait for external event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.643 187212 DEBUG nova.virt.libvirt.vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:27Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.643 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.644 187212 DEBUG nova.network.os_vif_util [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.644 187212 DEBUG os_vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.645 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02d6eab5-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap02d6eab5-45, col_values=(('external_ids', {'iface-id': '02d6eab5-4561-4d9f-ad9a-169b57667224', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:b7:ec', 'vm-uuid': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.652 187212 DEBUG nova.scheduler.client.report [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.664 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:34 np0005546909 NetworkManager[55691]: <info>  [1764936094.6657] manager: (tap02d6eab5-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.672 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.673 187212 INFO os_vif [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45')#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.676 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.677 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.728 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.729 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.746 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.752 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:d4:b7:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.753 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Using config drive#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.769 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.885 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.888 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.888 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating image(s)#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.889 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.889 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.890 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.906 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.968 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.969 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.970 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:34 np0005546909 nova_compute[187208]: 2025-12-05 12:01:34.988 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.010 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Successfully updated port: 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquired lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.030 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.050 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.051 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.071 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.089 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.090 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.090 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.156 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.157 187212 DEBUG nova.virt.disk.api [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Checking if we can resize image /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.158 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG nova.compute.manager [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-changed-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG nova.compute.manager [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Refreshing instance network info cache due to event network-changed-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.180 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.203 187212 INFO nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Creating config drive at /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.208 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm__mfxf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.233 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.234 187212 DEBUG nova.virt.disk.api [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Cannot resize image /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.235 187212 DEBUG nova.objects.instance [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'migration_context' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.247 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.247 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Ensure instance console log exists: /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.248 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.284 187212 DEBUG nova.policy [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79758a6c7516459bb1907270241d266a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '342e6d694cf6482c9f1b7557a17bce60', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.298 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.298 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.300 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.336 187212 DEBUG oslo_concurrency.processutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvm__mfxf" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:35 np0005546909 kernel: tap02d6eab5-45: entered promiscuous mode
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.4123] manager: (tap02d6eab5-45): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Dec  5 07:01:35 np0005546909 systemd-udevd[216832]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:35Z|00117|binding|INFO|Claiming lport 02d6eab5-4561-4d9f-ad9a-169b57667224 for this chassis.
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:35Z|00118|binding|INFO|02d6eab5-4561-4d9f-ad9a-169b57667224: Claiming fa:16:3e:d4:b7:ec 10.100.0.5
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.4290] device (tap02d6eab5-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.4299] device (tap02d6eab5-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.430 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:ec 10.100.0.5'], port_security=['fa:16:3e:d4:b7:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=02d6eab5-4561-4d9f-ad9a-169b57667224) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.431 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 02d6eab5-4561-4d9f-ad9a-169b57667224 in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 bound to our chassis#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.434 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.440 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.446 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9a05ebc3-8231-4088-b051-c825999dea6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.447 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap423f0bba-21 in ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.449 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap423f0bba-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1c8ca0-46e7-4f5d-9f71-6f7c06cefff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.450 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[07b72dc4-8c3f-49e3-95ef-2017bce822e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 systemd-machined[153543]: New machine qemu-27-instance-00000018.
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.465 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fa251933-9fd8-4beb-82cb-a6e9c1c66e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:35Z|00119|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 ovn-installed in OVS
Dec  5 07:01:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:35Z|00120|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 up in Southbound
Dec  5 07:01:35 np0005546909 systemd[1]: Started Virtual Machine qemu-27-instance-00000018.
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.493 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c72c708e-df43-4f02-9116-0f672643e3c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.525 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[93bcea32-749c-4592-95f8-2d96a7dba886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.530 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e095c130-f519-4879-b70e-119f1ec738ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.5320] manager: (tap423f0bba-20): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.564 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[eaee02f9-6d08-433b-8c7f-9598a4331851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.568 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[78d5f2ce-c38a-4dd1-b605-488d0762e756]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.5906] device (tap423f0bba-20): carrier: link connected
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.595 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[92a3e489-a209-454b-9a57-461a84c8a8c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.615 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3c3c49-cd4a-4a62-94f5-2047203b5d8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216916, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d0e867-c2a1-4b7d-9f75-b281a6efd3c8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:51bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347615, 'tstamp': 347615}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216919, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.647 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[30c545e8-1e42-4e9b-8ee2-abbfd0093cfc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216924, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.679 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db76bc1b-b97a-4da3-a401-fd6423a001e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.722 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.7215908, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.722 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.731 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f422dd6-b1e2-45c1-b410-cdcae528dcfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.732 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.732 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.733 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.744 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.747 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.7217267, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.747 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.773 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.778 187212 DEBUG nova.compute.manager [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.778 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG oslo_concurrency.lockutils [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.779 187212 DEBUG nova.compute.manager [req-17a8f765-c0ba-4559-91e1-e67bd1a581c6 req-a13d951c-6fbf-4b2a-aaf9-7a5ca49c294b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Processing event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.780 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 kernel: tap423f0bba-20: entered promiscuous mode
Dec  5 07:01:35 np0005546909 NetworkManager[55691]: <info>  [1764936095.7836] manager: (tap423f0bba-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.786 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:35Z|00121|binding|INFO|Releasing lport 8801ec73-6ce8-4039-ab6c-4693dcbc877e from this chassis (sb_readonly=0)
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.790 187212 INFO nova.virt.libvirt.driver [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance spawned successfully.#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.790 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936095.782405, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.792 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.801 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.802 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a703711d-a87d-44d1-9de3-625eac5e2ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.803 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-423f0bba-22e2-4219-9338-a671dbe69e42
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/423f0bba-22e2-4219-9338-a671dbe69e42.pid.haproxy
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 423f0bba-22e2-4219-9338-a671dbe69e42
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:01:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:35.804 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'env', 'PROCESS_TAG=haproxy-423f0bba-22e2-4219-9338-a671dbe69e42', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/423f0bba-22e2-4219-9338-a671dbe69e42.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.810 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.814 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.815 187212 DEBUG nova.virt.libvirt.driver [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.819 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.889 187212 INFO nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 8.35 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.889 187212 DEBUG nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.944 187212 INFO nova.compute.manager [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 8.85 seconds to build instance.#033[00m
Dec  5 07:01:35 np0005546909 nova_compute[187208]: 2025-12-05 12:01:35.970 187212 DEBUG oslo_concurrency.lockutils [None req-22e7d724-4ba7-4e0b-80a8-3662befa19d1 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.083 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.177 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.178 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.178 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.180 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updated VIF entry in instance network info cache for port 02d6eab5-4561-4d9f-ad9a-169b57667224. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.181 187212 DEBUG nova.network.neutron [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.204 187212 DEBUG oslo_concurrency.lockutils [req-37519478-f494-4b86-88eb-f62e8c03e26d req-37317ab7-9fcd-4802-bf4d-4d7ce1e0751e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:36 np0005546909 podman[216958]: 2025-12-05 12:01:36.162036717 +0000 UTC m=+0.029387420 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:01:36 np0005546909 podman[216958]: 2025-12-05 12:01:36.49398583 +0000 UTC m=+0.361336483 container create 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:01:36 np0005546909 systemd[1]: Started libpod-conmon-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope.
Dec  5 07:01:36 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:01:36 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c6b011a5a4020501a80db3fcb573c57c4bdcfb8d5de9a077c6de3d75c9302b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:01:36 np0005546909 podman[216972]: 2025-12-05 12:01:36.576414905 +0000 UTC m=+0.058054700 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:01:36 np0005546909 podman[216958]: 2025-12-05 12:01:36.581790659 +0000 UTC m=+0.449141342 container init 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:01:36 np0005546909 podman[216958]: 2025-12-05 12:01:36.587253425 +0000 UTC m=+0.454604078 container start 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:01:36 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : New worker (217006) forked
Dec  5 07:01:36 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : Loading success.
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.977 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.977 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:36 np0005546909 nova_compute[187208]: 2025-12-05 12:01:36.978 187212 DEBUG nova.objects.instance [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'flavor' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.019 187212 DEBUG nova.objects.instance [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'pci_requests' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.036 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.374 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.412 187212 DEBUG nova.network.neutron [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Releasing lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance network_info: |[{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.438 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.439 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Refreshing network info cache for port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.441 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start _get_guest_xml network_info=[{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.446 187212 WARNING nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.451 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.452 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.455 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.456 187212 DEBUG nova.virt.libvirt.host [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.456 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.457 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.457 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.458 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.459 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.460 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.460 187212 DEBUG nova.virt.hardware [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.465 187212 DEBUG nova.virt.libvirt.vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-ImagesNegativeTe
stJSON-965315619-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:25Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.466 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.466 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.468 187212 DEBUG nova.objects.instance [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.490 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <uuid>1282e776-5758-493b-8f52-59839ebcd31b</uuid>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <name>instance-00000017</name>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesNegativeTestJSON-server-1341604448</nova:name>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:37</nova:creationTime>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:user uuid="496da6872d53413ea1c201178cf5b05c">tempest-ImagesNegativeTestJSON-965315619-project-member</nova:user>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:project uuid="8400e354e93c4b33b8d683012dfe5c94">tempest-ImagesNegativeTestJSON-965315619</nova:project>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        <nova:port uuid="9bb4b8ce-5722-4698-aa3d-6d891ab14b0d">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="serial">1282e776-5758-493b-8f52-59839ebcd31b</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="uuid">1282e776-5758-493b-8f52-59839ebcd31b</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:06:f0:53"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <target dev="tap9bb4b8ce-57"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/console.log" append="off"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:37 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:37 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:37 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:37 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.491 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Preparing to wait for external event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.491 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.492 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.492 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.493 187212 DEBUG nova.virt.libvirt.vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-Images
NegativeTestJSON-965315619-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:25Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.493 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.494 187212 DEBUG nova.network.os_vif_util [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.494 187212 DEBUG os_vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.495 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.496 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.500 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.500 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bb4b8ce-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.501 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9bb4b8ce-57, col_values=(('external_ids', {'iface-id': '9bb4b8ce-5722-4698-aa3d-6d891ab14b0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:f0:53', 'vm-uuid': '1282e776-5758-493b-8f52-59839ebcd31b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:37 np0005546909 NetworkManager[55691]: <info>  [1764936097.5039] manager: (tap9bb4b8ce-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.512 187212 INFO os_vif [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57')#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.588 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [{"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.640 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-982a8e69-5181-4847-bdfe-8d4de12bb2e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.641 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.649 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.653 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.724 187212 DEBUG nova.policy [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a5b1ecad65045afbe3c154494417765', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c184f0f2b71412fb560981314d0574d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.756 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.757 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.932 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.932 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.933 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] No VIF found with MAC fa:16:3e:06:f0:53, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.933 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Using config drive#033[00m
Dec  5 07:01:37 np0005546909 nova_compute[187208]: 2025-12-05 12:01:37.962 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.021 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.023 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.084 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.086 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000017, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config'#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.091 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.147 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.149 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.204 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.211 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.267 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.268 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.322 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.328 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.385 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.386 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.438 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.444 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.500 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.502 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.523 187212 INFO nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Creating config drive at /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.529 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7c7gn5vm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.560 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.566 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.621 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.622 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.653 187212 DEBUG oslo_concurrency.processutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7c7gn5vm" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.714 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Successfully created port: 78310fa8-21e8-49e5-8b60-867d1089ad71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:01:38 np0005546909 kernel: tap9bb4b8ce-57: entered promiscuous mode
Dec  5 07:01:38 np0005546909 NetworkManager[55691]: <info>  [1764936098.7184] manager: (tap9bb4b8ce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Dec  5 07:01:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:38Z|00122|binding|INFO|Claiming lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for this chassis.
Dec  5 07:01:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:38Z|00123|binding|INFO|9bb4b8ce-5722-4698-aa3d-6d891ab14b0d: Claiming fa:16:3e:06:f0:53 10.100.0.14
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.732 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f0:53 10.100.0.14'], port_security=['fa:16:3e:06:f0:53 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1282e776-5758-493b-8f52-59839ebcd31b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455bb7e1-6680-472e-861f-da50aef09a7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8400e354e93c4b33b8d683012dfe5c94', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9474b356-5c55-44a1-af48-0eeaf9a9ad0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07a9feeb-8467-4a6f-b0e2-fda2f133d3ac, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.734 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d in datapath 455bb7e1-6680-472e-861f-da50aef09a7f bound to our chassis#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.736 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 455bb7e1-6680-472e-861f-da50aef09a7f#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.748 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d41cae-5357-4a90-9b9e-ac92e16d491c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.749 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap455bb7e1-61 in ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.752 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap455bb7e1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.752 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d93dd69c-33ed-4fa2-ba43-5fc87d249d68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.753 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d81b7850-c9fa-4cfa-b91c-d71b43830c92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 systemd-udevd[217073]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:38 np0005546909 systemd-machined[153543]: New machine qemu-28-instance-00000017.
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.770 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3a5fae-5726-4441-ab26-07e4c6d01a5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 NetworkManager[55691]: <info>  [1764936098.7771] device (tap9bb4b8ce-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:38 np0005546909 NetworkManager[55691]: <info>  [1764936098.7781] device (tap9bb4b8ce-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:38 np0005546909 systemd[1]: Started Virtual Machine qemu-28-instance-00000017.
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.785 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:38Z|00124|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d ovn-installed in OVS
Dec  5 07:01:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:38Z|00125|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d up in Southbound
Dec  5 07:01:38 np0005546909 nova_compute[187208]: 2025-12-05 12:01:38.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.796 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5666218-a345-46d7-b3e2-d75a8a00e48b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.822 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[01f78893-9813-469b-bc8d-020e096de40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[be98a400-6ec5-49e8-ba4c-ab315f19d56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 NetworkManager[55691]: <info>  [1764936098.8293] manager: (tap455bb7e1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.854 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[119c2620-de64-4b96-99e8-31b3a39c3441]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.857 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf52328-b3bb-461b-b218-ed64d076ee32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 NetworkManager[55691]: <info>  [1764936098.8802] device (tap455bb7e1-60): carrier: link connected
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.886 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c477cbef-1335-4058-b652-01b238068274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a614fc-4f4f-47ed-9128-7386d5d629b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455bb7e1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:a2:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347944, 'reachable_time': 33412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217105, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.920 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e85749b7-51b5-4ba1-a99b-58deed3bd1fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:a24a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347944, 'tstamp': 347944}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217106, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.939 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05fefbbc-4bd1-4712-8553-c5d66ba6bde4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap455bb7e1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:a2:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347944, 'reachable_time': 33412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217107, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:38.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e321311a-9284-4e8f-94e3-6257b30c7697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.005 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.007 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5053MB free_disk=73.21169662475586GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.007 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.008 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.028 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52279ea9-3785-4f07-baf2-ff853ea3254e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455bb7e1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.030 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap455bb7e1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:39 np0005546909 NetworkManager[55691]: <info>  [1764936099.0661] manager: (tap455bb7e1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:39 np0005546909 kernel: tap455bb7e1-60: entered promiscuous mode
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.068 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap455bb7e1-60, col_values=(('external_ids', {'iface-id': '261e0bd9-3b3f-4cf7-b0f8-84547701ff1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:39Z|00126|binding|INFO|Releasing lport 261e0bd9-3b3f-4cf7-b0f8-84547701ff1a from this chassis (sb_readonly=0)
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.081 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.082 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8458e30-4961-416a-b0b4-52481bf6fb67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.083 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-455bb7e1-6680-472e-861f-da50aef09a7f
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/455bb7e1-6680-472e-861f-da50aef09a7f.pid.haproxy
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 455bb7e1-6680-472e-861f-da50aef09a7f
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:01:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:39.084 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'env', 'PROCESS_TAG=haproxy-455bb7e1-6680-472e-861f-da50aef09a7f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/455bb7e1-6680-472e-861f-da50aef09a7f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.112 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 1282e776-5758-493b-8f52-59839ebcd31b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.113 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance adc15883-b705-42dd-ac95-04f4b8964012 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.114 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.277 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.297 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.320 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.321 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.420 187212 DEBUG nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.421 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.421 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 DEBUG oslo_concurrency.lockutils [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 DEBUG nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.422 187212 WARNING nova.compute.manager [req-023750c4-2696-46e8-bfff-3a277cf96ad9 req-95c103ff-473b-4981-b098-979414a8a768 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:01:39 np0005546909 podman[217139]: 2025-12-05 12:01:39.458583952 +0000 UTC m=+0.051331167 container create 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:01:39 np0005546909 systemd[1]: Started libpod-conmon-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope.
Dec  5 07:01:39 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:01:39 np0005546909 podman[217139]: 2025-12-05 12:01:39.432190108 +0000 UTC m=+0.024937343 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:01:39 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4149b47c4e6793ce3b6fc5ffdd499aa39b6bd1d4bb7bbc0659f951080559deea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:01:39 np0005546909 podman[217139]: 2025-12-05 12:01:39.549981543 +0000 UTC m=+0.142728758 container init 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:01:39 np0005546909 podman[217139]: 2025-12-05 12:01:39.558827266 +0000 UTC m=+0.151574481 container start 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:01:39 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : New worker (217168) forked
Dec  5 07:01:39 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : Loading success.
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.626 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936099.626391, 1282e776-5758-493b-8f52-59839ebcd31b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.627 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.647 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.651 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936099.6264863, 1282e776-5758-493b-8f52-59839ebcd31b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.651 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.673 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.676 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.695 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.728 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.730 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.731 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:39 np0005546909 nova_compute[187208]: 2025-12-05 12:01:39.731 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.208 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updated VIF entry in instance network info cache for port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.209 187212 DEBUG nova.network.neutron [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [{"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.215 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.215 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.239 187212 DEBUG oslo_concurrency.lockutils [req-2a05d7cd-079d-478a-beff-4ba655640094 req-32227444-111d-4d4e-9c10-b28840eeef8e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1282e776-5758-493b-8f52-59839ebcd31b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.240 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.309 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.309 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.316 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.316 187212 INFO nova.compute.claims [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.361 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully created port: 0d74b914-0dbd-4356-8304-a42943811e2e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.543 187212 DEBUG nova.compute.provider_tree [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.601 187212 DEBUG nova.scheduler.client.report [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.867 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.868 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.946 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:01:40 np0005546909 nova_compute[187208]: 2025-12-05 12:01:40.947 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.050 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.152 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.457 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.459 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.459 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating image(s)#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.460 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.461 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.462 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.477 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.543 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.544 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.545 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.556 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.611 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.612 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.655 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.657 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.657 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.717 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.718 187212 DEBUG nova.virt.disk.api [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Checking if we can resize image /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.719 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.783 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.784 187212 DEBUG nova.virt.disk.api [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Cannot resize image /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.784 187212 DEBUG nova.objects.instance [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'migration_context' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.789 187212 DEBUG nova.policy [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff53b25ec85543eeb2bdea04a6eeaac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.806 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.806 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Ensure instance console log exists: /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:41 np0005546909 nova_compute[187208]: 2025-12-05 12:01:41.807 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.376 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.501 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Successfully updated port: 78310fa8-21e8-49e5-8b60-867d1089ad71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.548 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.548 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.549 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:42 np0005546909 nova_compute[187208]: 2025-12-05 12:01:42.927 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:43.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.535 187212 DEBUG nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.535 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG oslo_concurrency.lockutils [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 DEBUG nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No event matching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d in dict_keys([('network-vif-plugged', '380c99a7-9480-45f8-b2f4-adfcdfa8576d')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Dec  5 07:01:43 np0005546909 nova_compute[187208]: 2025-12-05 12:01:43.536 187212 WARNING nova.compute.manager [req-7df4745c-2731-4db3-8236-f9ea8880234f req-ce2f5733-fa23-48ae-a3fc-32d6eda58798 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.025 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Successfully updated port: 0d74b914-0dbd-4356-8304-a42943811e2e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.051 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.052 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.052 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.421 187212 WARNING nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] 423f0bba-22e2-4219-9338-a671dbe69e42 already exists in list: networks containing: ['423f0bba-22e2-4219-9338-a671dbe69e42']. ignoring it#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.731 187212 DEBUG nova.network.neutron [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.748 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.748 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance network_info: |[{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.750 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start _get_guest_xml network_info=[{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.756 187212 WARNING nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.763 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.764 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.769 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.769 187212 DEBUG nova.virt.libvirt.host [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:01:21Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1144768517',id=26,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-837660852',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.770 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.771 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.772 187212 DEBUG nova.virt.hardware [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.virt.libvirt.vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.776 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.777 187212 DEBUG nova.objects.instance [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'pci_devices' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.790 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <uuid>adc15883-b705-42dd-ac95-04f4b8964012</uuid>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <name>instance-00000019</name>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-555517467</nova:name>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:44</nova:creationTime>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:flavor name="tempest-flavor_with_ephemeral_0-837660852">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:user uuid="79758a6c7516459bb1907270241d266a">tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member</nova:user>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:project uuid="342e6d694cf6482c9f1b7557a17bce60">tempest-ServersWithSpecificFlavorTestJSON-1976479976</nova:project>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        <nova:port uuid="78310fa8-21e8-49e5-8b60-867d1089ad71">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="serial">adc15883-b705-42dd-ac95-04f4b8964012</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="uuid">adc15883-b705-42dd-ac95-04f4b8964012</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:c8:42:5d"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <target dev="tap78310fa8-21"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/console.log" append="off"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:44 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:44 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:44 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:44 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Preparing to wait for external event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.791 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.792 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.792 187212 DEBUG nova.virt.libvirt.vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.793 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.794 187212 DEBUG nova.network.os_vif_util [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.794 187212 DEBUG os_vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.795 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.796 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.800 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap78310fa8-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.800 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap78310fa8-21, col_values=(('external_ids', {'iface-id': '78310fa8-21e8-49e5-8b60-867d1089ad71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:42:5d', 'vm-uuid': 'adc15883-b705-42dd-ac95-04f4b8964012'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:44 np0005546909 NetworkManager[55691]: <info>  [1764936104.8405] manager: (tap78310fa8-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.842 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.844 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.849 187212 INFO os_vif [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21')#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.871 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Successfully created port: c72089e0-4937-40b6-86b5-f9d6d0982058 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.917 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.918 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.918 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No VIF found with MAC fa:16:3e:c8:42:5d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:44 np0005546909 nova_compute[187208]: 2025-12-05 12:01:44.919 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Using config drive#033[00m
Dec  5 07:01:44 np0005546909 podman[217208]: 2025-12-05 12:01:44.969991398 +0000 UTC m=+0.073866381 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:01:45 np0005546909 nova_compute[187208]: 2025-12-05 12:01:45.739 187212 INFO nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Creating config drive at /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config#033[00m
Dec  5 07:01:45 np0005546909 nova_compute[187208]: 2025-12-05 12:01:45.745 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8bht2hb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:45 np0005546909 nova_compute[187208]: 2025-12-05 12:01:45.879 187212 DEBUG oslo_concurrency.processutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi8bht2hb" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:45 np0005546909 NetworkManager[55691]: <info>  [1764936105.9446] manager: (tap78310fa8-21): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Dec  5 07:01:45 np0005546909 kernel: tap78310fa8-21: entered promiscuous mode
Dec  5 07:01:45 np0005546909 systemd-udevd[217244]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:45Z|00127|binding|INFO|Claiming lport 78310fa8-21e8-49e5-8b60-867d1089ad71 for this chassis.
Dec  5 07:01:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:45Z|00128|binding|INFO|78310fa8-21e8-49e5-8b60-867d1089ad71: Claiming fa:16:3e:c8:42:5d 10.100.0.11
Dec  5 07:01:45 np0005546909 nova_compute[187208]: 2025-12-05 12:01:45.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.000 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:42:5d 10.100.0.11'], port_security=['fa:16:3e:c8:42:5d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'adc15883-b705-42dd-ac95-04f4b8964012', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '2', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=78310fa8-21e8-49e5-8b60-867d1089ad71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 78310fa8-21e8-49e5-8b60-867d1089ad71 in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 bound to our chassis#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.006 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637#033[00m
Dec  5 07:01:46 np0005546909 NetworkManager[55691]: <info>  [1764936106.0088] device (tap78310fa8-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:46 np0005546909 NetworkManager[55691]: <info>  [1764936106.0099] device (tap78310fa8-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.019 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91f8a959-e5d2-4f59-98c7-7c9acbb2b526]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.020 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap393d33f9-21 in ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.023 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap393d33f9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4c91fa4f-2f91-4f92-83d3-878bf932100e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.025 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[88ca9964-7479-46ef-bac6-b381fb062808]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 systemd-machined[153543]: New machine qemu-29-instance-00000019.
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.037 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[01844aeb-ad28-4b0e-b2e9-074a8b5b5acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.047 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:46Z|00129|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 ovn-installed in OVS
Dec  5 07:01:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:46Z|00130|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 up in Southbound
Dec  5 07:01:46 np0005546909 systemd[1]: Started Virtual Machine qemu-29-instance-00000019.
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d37485-bde1-4be2-adf7-2840488d714d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.097 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[805a4542-5121-4306-bf49-7cdd9263a442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 NetworkManager[55691]: <info>  [1764936106.1065] manager: (tap393d33f9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Dec  5 07:01:46 np0005546909 systemd-udevd[217250]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.106 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[594dc9b6-7d54-418a-b562-50a331aa3a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.142 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c57c635-e590-4017-bb81-c7308f5c6119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.146 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[49e51972-16b1-4567-b0a9-f0ee9d020dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 NetworkManager[55691]: <info>  [1764936106.1837] device (tap393d33f9-20): carrier: link connected
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.191 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b490da-3e07-43d9-a9bb-44d3bb951305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.211 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5125daff-7dcb-4db8-a91f-e5d99ea2c6cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348674, 'reachable_time': 19169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217280, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.232 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[96525c79-9fbe-498c-82a6-7b03db581dfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:b198'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 348674, 'tstamp': 348674}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217281, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.257 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[deded7e2-f591-4c62-bd60-332b2dc707b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348674, 'reachable_time': 19169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217282, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[41fc2a99-7fb6-46f1-9275-d03b658c75d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.360 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45350962-a7fd-4500-b547-25d154555e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.361 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap393d33f9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 NetworkManager[55691]: <info>  [1764936106.3642] manager: (tap393d33f9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec  5 07:01:46 np0005546909 kernel: tap393d33f9-20: entered promiscuous mode
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap393d33f9-20, col_values=(('external_ids', {'iface-id': '4f5e3c8a-5273-4414-820c-16ae051153f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.371 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:46Z|00131|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.374 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8652d6b4-78c3-4225-9295-6b11bfe81602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.377 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:01:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:46.379 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'env', 'PROCESS_TAG=haproxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.438 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.439 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing instance network info cache due to event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.440 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.459 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936106.4568458, adc15883-b705-42dd-ac95-04f4b8964012 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.460 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.480 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.487 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936106.4571378, adc15883-b705-42dd-ac95-04f4b8964012 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.487 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.508 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.514 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:46 np0005546909 nova_compute[187208]: 2025-12-05 12:01:46.535 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:46 np0005546909 podman[217321]: 2025-12-05 12:01:46.719206509 +0000 UTC m=+0.032636843 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.496 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 podman[217321]: 2025-12-05 12:01:47.523136675 +0000 UTC m=+0.836566989 container create f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:01:47 np0005546909 systemd[1]: Started libpod-conmon-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope.
Dec  5 07:01:47 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:01:47 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/26091b88884c53d07a76d03c6c9c66adb5d232a7c306c3b78dafe02bf1e95c96/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.668 187212 DEBUG nova.network.neutron [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.686 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.690 187212 DEBUG nova.virt.libvirt.vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ra
m='0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.691 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.691 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG os_vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.692 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.693 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.695 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d74b914-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.696 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0d74b914-0d, col_values=(('external_ids', {'iface-id': '0d74b914-0dbd-4356-8304-a42943811e2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a5:e2:69', 'vm-uuid': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 NetworkManager[55691]: <info>  [1764936107.6989] manager: (tap0d74b914-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.699 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.706 187212 INFO os_vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d')#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.706 187212 DEBUG nova.virt.libvirt.vif [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ra
m='0',owner_project_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.707 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.707 187212 DEBUG nova.network.os_vif_util [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.710 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Successfully updated port: c72089e0-4937-40b6-86b5-f9d6d0982058 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.711 187212 DEBUG nova.virt.libvirt.guest [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] attach device xml: <interface type="ethernet">
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:a5:e2:69"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <target dev="tap0d74b914-0d"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:01:47 np0005546909 nova_compute[187208]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  5 07:01:47 np0005546909 kernel: tap0d74b914-0d: entered promiscuous mode
Dec  5 07:01:47 np0005546909 NetworkManager[55691]: <info>  [1764936107.7257] manager: (tap0d74b914-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec  5 07:01:47 np0005546909 systemd-udevd[217262]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:47Z|00132|binding|INFO|Claiming lport 0d74b914-0dbd-4356-8304-a42943811e2e for this chassis.
Dec  5 07:01:47 np0005546909 podman[217321]: 2025-12-05 12:01:47.727117202 +0000 UTC m=+1.040547536 container init f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:47Z|00133|binding|INFO|0d74b914-0dbd-4356-8304-a42943811e2e: Claiming fa:16:3e:a5:e2:69 10.100.0.10
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.731 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:01:47 np0005546909 podman[217321]: 2025-12-05 12:01:47.735167752 +0000 UTC m=+1.048598066 container start f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.737 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:e2:69 10.100.0.10'], port_security=['fa:16:3e:a5:e2:69 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d74b914-0dbd-4356-8304-a42943811e2e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:47 np0005546909 NetworkManager[55691]: <info>  [1764936107.7392] device (tap0d74b914-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:47 np0005546909 NetworkManager[55691]: <info>  [1764936107.7401] device (tap0d74b914-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:47Z|00134|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e ovn-installed in OVS
Dec  5 07:01:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:47Z|00135|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e up in Southbound
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : New worker (217357) forked
Dec  5 07:01:47 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : Loading success.
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.821 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d74b914-0dbd-4356-8304-a42943811e2e in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.824 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:d4:b7:ec, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.827 187212 DEBUG nova.virt.libvirt.driver [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] No VIF found with MAC fa:16:3e:a5:e2:69, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.838 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ead51a60-412c-44c6-9080-b5d44619ffe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.853 187212 DEBUG nova.virt.libvirt.guest [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesV270Test-server-1046212835</nova:name>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:01:47</nova:creationTime>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:user uuid="9a5b1ecad65045afbe3c154494417765">tempest-AttachInterfacesV270Test-1975383464-project-member</nova:user>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:project uuid="2c184f0f2b71412fb560981314d0574d">tempest-AttachInterfacesV270Test-1975383464</nova:project>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:port uuid="02d6eab5-4561-4d9f-ad9a-169b57667224">
Dec  5 07:01:47 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    <nova:port uuid="0d74b914-0dbd-4356-8304-a42943811e2e">
Dec  5 07:01:47 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:01:47 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:01:47 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:01:47 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.876 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cccf12c8-68bd-4408-ab47-1d9ad70a8e4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.879 187212 DEBUG oslo_concurrency.lockutils [None req-9acdf767-30f7-4027-867b-e1358075141b 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "interface-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 10.902s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.880 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[98a64066-02ce-415b-9969-9beed3f64113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.913 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4bcc21-3945-49b0-a92d-8a98a72dadf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.936 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbc5671-bc03-42ef-bd47-1b417e14a599]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217383, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[235eb5bd-0bb2-4bb0-a3d2-8ef482be95fb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347626, 'tstamp': 347626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217384, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347629, 'tstamp': 347629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217384, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.954 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 nova_compute[187208]: 2025-12-05 12:01:47.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.959 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:47.959 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:48Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a5:e2:69 10.100.0.10
Dec  5 07:01:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:48Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a5:e2:69 10.100.0.10
Dec  5 07:01:48 np0005546909 nova_compute[187208]: 2025-12-05 12:01:48.882 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:01:49 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:49Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d4:b7:ec 10.100.0.5
Dec  5 07:01:49 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:49Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d4:b7:ec 10.100.0.5
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.374 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updated VIF entry in instance network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.375 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.395 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.396 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.396 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Processing event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-changed-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing instance network info cache due to event network-changed-0d74b914-0dbd-4356-8304-a42943811e2e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.397 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.398 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.398 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Refreshing network info cache for port 0d74b914-0dbd-4356-8304-a42943811e2e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.399 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.404 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936109.4036753, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.404 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.415 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.423 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance spawned successfully.#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.423 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.430 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.433 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.445 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.446 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.447 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.447 187212 DEBUG nova.virt.libvirt.driver [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.453 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.507 187212 DEBUG nova.compute.manager [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.568 187212 DEBUG nova.objects.instance [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:01:49 np0005546909 nova_compute[187208]: 2025-12-05 12:01:49.630 187212 DEBUG oslo_concurrency.lockutils [None req-2031f7c3-745b-4a49-9e6b-3c4740476f19 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:50 np0005546909 podman[217387]: 2025-12-05 12:01:50.227136162 +0000 UTC m=+0.069302451 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  5 07:01:50 np0005546909 podman[217407]: 2025-12-05 12:01:50.330562855 +0000 UTC m=+0.060225870 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7)
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.793 187212 DEBUG nova.network.neutron [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.818 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.818 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance network_info: |[{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.820 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start _get_guest_xml network_info=[{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.824 187212 WARNING nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.832 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.834 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.838 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.838 187212 DEBUG nova.virt.libvirt.host [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.839 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.840 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.841 187212 DEBUG nova.virt.hardware [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.845 187212 DEBUG nova.virt.libvirt.vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.845 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.846 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.848 187212 DEBUG nova.objects.instance [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG nova.compute.manager [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG nova.compute.manager [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing instance network info cache due to event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.852 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.853 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.853 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.868 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <uuid>1606eea3-5389-4437-b0f9-cfe6084d7871</uuid>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <name>instance-0000001a</name>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersTestManualDisk-server-306695219</nova:name>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:01:50</nova:creationTime>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:user uuid="ff53b25ec85543eeb2bdea04a6eeaac4">tempest-ServersTestManualDisk-1916815153-project-member</nova:user>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:project uuid="e3cd52d70d1a4be8ae891298ff7e1018">tempest-ServersTestManualDisk-1916815153</nova:project>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        <nova:port uuid="c72089e0-4937-40b6-86b5-f9d6d0982058">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="serial">1606eea3-5389-4437-b0f9-cfe6084d7871</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="uuid">1606eea3-5389-4437-b0f9-cfe6084d7871</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ea:73:d9"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <target dev="tapc72089e0-49"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/console.log" append="off"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:01:50 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:01:50 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:01:50 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:01:50 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Preparing to wait for external event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.870 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.871 187212 DEBUG nova.virt.libvirt.vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:01:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.872 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.872 187212 DEBUG nova.network.os_vif_util [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.873 187212 DEBUG os_vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.877 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.878 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.886 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc72089e0-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.887 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc72089e0-49, col_values=(('external_ids', {'iface-id': 'c72089e0-4937-40b6-86b5-f9d6d0982058', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:73:d9', 'vm-uuid': '1606eea3-5389-4437-b0f9-cfe6084d7871'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:50 np0005546909 NetworkManager[55691]: <info>  [1764936110.8906] manager: (tapc72089e0-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.898 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.899 187212 INFO os_vif [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49')#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.904 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 DEBUG oslo_concurrency.lockutils [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 DEBUG nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.905 187212 WARNING nova.compute.manager [req-d2c101b6-f666-4e4a-8a9d-bee84bdc095a req-5d4a672a-f917-4c48-b292-3968e81f2242 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.967 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.968 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.969 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] No VIF found with MAC fa:16:3e:ea:73:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:01:50 np0005546909 nova_compute[187208]: 2025-12-05 12:01:50.970 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Using config drive#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.665 187212 INFO nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Creating config drive at /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.673 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppq3rnbg_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.705 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updated VIF entry in instance network info cache for port 0d74b914-0dbd-4356-8304-a42943811e2e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.706 187212 DEBUG nova.network.neutron [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [{"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.741 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.742 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 WARNING nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.743 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 DEBUG oslo_concurrency.lockutils [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 DEBUG nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.744 187212 WARNING nova.compute.manager [req-9bc569fd-8152-4524-883c-0a72e70dabac req-7332a4d1-5a8e-4364-be71-89cbd161141d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state active and task_state rebuild_spawning.#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.806 187212 DEBUG oslo_concurrency.processutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppq3rnbg_" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:01:51 np0005546909 kernel: tapc72089e0-49: entered promiscuous mode
Dec  5 07:01:51 np0005546909 NetworkManager[55691]: <info>  [1764936111.8715] manager: (tapc72089e0-49): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec  5 07:01:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:51Z|00136|binding|INFO|Claiming lport c72089e0-4937-40b6-86b5-f9d6d0982058 for this chassis.
Dec  5 07:01:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:51Z|00137|binding|INFO|c72089e0-4937-40b6-86b5-f9d6d0982058: Claiming fa:16:3e:ea:73:d9 10.100.0.11
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.875 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.894 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:73:d9 10.100.0.11'], port_security=['fa:16:3e:ea:73:d9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1606eea3-5389-4437-b0f9-cfe6084d7871', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'neutron:revision_number': '2', 'neutron:security_group_ids': '753f16cd-17e0-4f5a-8936-b01e8b5b8119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1ba8a60-bda5-4c97-91b2-1ae7ea8aa092, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c72089e0-4937-40b6-86b5-f9d6d0982058) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.895 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c72089e0-4937-40b6-86b5-f9d6d0982058 in datapath 904b3233-fdc6-4df0-b02a-f30a1e47627b bound to our chassis#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.900 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 904b3233-fdc6-4df0-b02a-f30a1e47627b#033[00m
Dec  5 07:01:51 np0005546909 systemd-udevd[217446]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.916 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7921a3f-1195-4f40-9c02-49cc80573b16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.918 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap904b3233-f1 in ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.920 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap904b3233-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.920 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a60dfebf-19ee-418d-8cd9-856628b37e43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b0dede-c6a8-42c7-bd4a-4fc85e8eb800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:51 np0005546909 NetworkManager[55691]: <info>  [1764936111.9363] device (tapc72089e0-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:01:51 np0005546909 NetworkManager[55691]: <info>  [1764936111.9380] device (tapc72089e0-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.938 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[43641ee1-cca4-40b6-9cea-4a956a5b60cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.946 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:51Z|00138|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 ovn-installed in OVS
Dec  5 07:01:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:51Z|00139|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 up in Southbound
Dec  5 07:01:51 np0005546909 nova_compute[187208]: 2025-12-05 12:01:51.953 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:51 np0005546909 systemd-machined[153543]: New machine qemu-30-instance-0000001a.
Dec  5 07:01:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:51.972 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43219dbe-4703-42e4-a897-bb84b5e90c17]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:51 np0005546909 systemd[1]: Started Virtual Machine qemu-30-instance-0000001a.
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.005 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2b8c83f-c0a1-4d0f-b849-e64b9885dedf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3718dcb-2bd1-4f25-b43d-af7e45569297]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 NetworkManager[55691]: <info>  [1764936112.0141] manager: (tap904b3233-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Dec  5 07:01:52 np0005546909 systemd-udevd[217449]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.046 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16adb1cb-1293-48d4-aa59-8058223810f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.049 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e319f1-bb39-4e7f-bbc5-070a4cc18f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 NetworkManager[55691]: <info>  [1764936112.0747] device (tap904b3233-f0): carrier: link connected
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.080 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4386b4fc-9866-4581-8f47-7ed0ba72c327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bd98a545-ef0f-4648-b0f2-396ff7f797d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap904b3233-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:be:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349263, 'reachable_time': 42986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217481, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.113 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62649561-7638-44f8-83e3-1e17e4f026fc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefa:be1f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349263, 'tstamp': 349263}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217482, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c284a64-221e-441a-87eb-b1345d8cfe7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap904b3233-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fa:be:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349263, 'reachable_time': 42986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217483, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[985c7a05-8e8f-4d92-ab44-e6286ce11c9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.226 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af935d5f-bf34-409d-9d4e-18bea0df1644]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.227 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap904b3233-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.228 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.228 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap904b3233-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:52 np0005546909 kernel: tap904b3233-f0: entered promiscuous mode
Dec  5 07:01:52 np0005546909 NetworkManager[55691]: <info>  [1764936112.2318] manager: (tap904b3233-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap904b3233-f0, col_values=(('external_ids', {'iface-id': '8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:01:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:01:52Z|00140|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.235 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.249 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.250 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e487b816-8c22-412f-8310-594e67ef8011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.251 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-904b3233-fdc6-4df0-b02a-f30a1e47627b
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/904b3233-fdc6-4df0-b02a-f30a1e47627b.pid.haproxy
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 904b3233-fdc6-4df0-b02a-f30a1e47627b
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:01:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:01:52.251 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'env', 'PROCESS_TAG=haproxy-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/904b3233-fdc6-4df0-b02a-f30a1e47627b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.388 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936112.387751, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.388 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Started (Lifecycle Event)#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.407 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.414 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936112.388031, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.415 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.448 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.454 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.481 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:52 np0005546909 nova_compute[187208]: 2025-12-05 12:01:52.500 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:52 np0005546909 podman[217520]: 2025-12-05 12:01:52.662053751 +0000 UTC m=+0.059216072 container create 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:01:52 np0005546909 systemd[1]: Started libpod-conmon-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope.
Dec  5 07:01:52 np0005546909 podman[217520]: 2025-12-05 12:01:52.62559086 +0000 UTC m=+0.022753201 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:01:52 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:01:52 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f782e29042a4d6587770f14d2c44c7361a37482c8593077cf3a706cbf5d68ff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:01:52 np0005546909 podman[217520]: 2025-12-05 12:01:52.758042253 +0000 UTC m=+0.155204604 container init 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:01:52 np0005546909 podman[217520]: 2025-12-05 12:01:52.763936402 +0000 UTC m=+0.161098723 container start 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:01:52 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : New worker (217541) forked
Dec  5 07:01:52 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : Loading success.
Dec  5 07:01:55 np0005546909 nova_compute[187208]: 2025-12-05 12:01:55.003 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updated VIF entry in instance network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:01:55 np0005546909 nova_compute[187208]: 2025-12-05 12:01:55.003 187212 DEBUG nova.network.neutron [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:01:55 np0005546909 nova_compute[187208]: 2025-12-05 12:01:55.037 187212 DEBUG oslo_concurrency.lockutils [req-d1fc86d7-7fa6-49ff-8b55-72fd2fdd65a3 req-fb8b709c-0c88-4e57-859e-14572b02b1ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:01:55 np0005546909 nova_compute[187208]: 2025-12-05 12:01:55.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:56 np0005546909 podman[217550]: 2025-12-05 12:01:56.200939669 +0000 UTC m=+0.051109092 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:01:56 np0005546909 podman[217551]: 2025-12-05 12:01:56.281305844 +0000 UTC m=+0.122315685 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  5 07:01:57 np0005546909 nova_compute[187208]: 2025-12-05 12:01:57.504 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.198 187212 DEBUG nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.198 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG oslo_concurrency.lockutils [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 DEBUG nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.199 187212 WARNING nova.compute.manager [req-c9bcc427-d786-40c0-bda5-2fafa224cf45 req-7ba33ad7-e2c6-4d12-87f5-cb0743a859e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.302 187212 DEBUG nova.compute.manager [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.303 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.303 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.304 187212 DEBUG oslo_concurrency.lockutils [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.304 187212 DEBUG nova.compute.manager [req-2c360a00-f04e-40e6-a4f6-07985ce9d841 req-e4fc8619-91c0-4a8f-8c57-b06b806c1f08 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Processing event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.305 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance event wait completed in 18 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.316 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936118.3163302, 1282e776-5758-493b-8f52-59839ebcd31b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.317 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.323 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.329 187212 INFO nova.virt.libvirt.driver [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance spawned successfully.#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.329 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.345 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.352 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.358 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.358 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.359 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.359 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.360 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.361 187212 DEBUG nova.virt.libvirt.driver [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.384 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.420 187212 INFO nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 33.13 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.420 187212 DEBUG nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.482 187212 INFO nova.compute.manager [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 33.63 seconds to build instance.#033[00m
Dec  5 07:01:58 np0005546909 nova_compute[187208]: 2025-12-05 12:01:58.503 187212 DEBUG oslo_concurrency.lockutils [None req-0eb28dc5-b90d-4171-b244-63bf82b8f86e 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.587 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.588 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.609 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.686 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.687 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.694 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.694 187212 INFO nova.compute.claims [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:01:59 np0005546909 nova_compute[187208]: 2025-12-05 12:01:59.993 187212 DEBUG nova.compute.provider_tree [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.010 187212 DEBUG nova.scheduler.client.report [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.031 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.031 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.124 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.125 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.153 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.174 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:02:00 np0005546909 podman[217598]: 2025-12-05 12:02:00.211981214 +0000 UTC m=+0.057873815 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.706 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating image(s)#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.707 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.708 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.708 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.729 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.798 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.801 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.801 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.818 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.865 187212 DEBUG nova.policy [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.884 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.886 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:00 np0005546909 nova_compute[187208]: 2025-12-05 12:02:00.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.127 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.128 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.130 187212 INFO nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Terminating instance#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.131 187212 DEBUG nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.135 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk 1073741824" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.135 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.136 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:01 np0005546909 kernel: tap02d6eab5-45 (unregistering): left promiscuous mode
Dec  5 07:02:01 np0005546909 NetworkManager[55691]: <info>  [1764936121.1804] device (tap02d6eab5-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.190 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00141|memory|INFO|peak resident set size grew 50% in last 985.0 seconds, from 16000 kB to 24064 kB
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00142|memory|INFO|idl-cells-OVN_Southbound:13154 idl-cells-Open_vSwitch:1269 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:2 lflow-cache-entries-cache-expr:359 lflow-cache-entries-cache-matches:275 lflow-cache-size-KB:1528 local_datapath_usage-KB:2 ofctrl_desired_flow_usage-KB:563 ofctrl_installed_flow_usage-KB:410 ofctrl_sb_flow_ref_usage-KB:211
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00143|binding|INFO|Releasing lport 02d6eab5-4561-4d9f-ad9a-169b57667224 from this chassis (sb_readonly=0)
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00144|binding|INFO|Setting lport 02d6eab5-4561-4d9f-ad9a-169b57667224 down in Southbound
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00145|binding|INFO|Removing iface tap02d6eab5-45 ovn-installed in OVS
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.194 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.202 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:ec 10.100.0.5'], port_security=['fa:16:3e:d4:b7:ec 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=02d6eab5-4561-4d9f-ad9a-169b57667224) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.204 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 02d6eab5-4561-4d9f-ad9a-169b57667224 in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.207 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 423f0bba-22e2-4219-9338-a671dbe69e42#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.210 187212 DEBUG nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG oslo_concurrency.lockutils [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.211 187212 DEBUG nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.212 187212 WARNING nova.compute.manager [req-fd098149-5f4b-42d5-bd2a-29e251fa4012 req-6dcc80b6-ab76-4804-89b8-11ac50f1ec51 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:02:01 np0005546909 kernel: tap0d74b914-0d (unregistering): left promiscuous mode
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 NetworkManager[55691]: <info>  [1764936121.2165] device (tap0d74b914-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00146|binding|INFO|Releasing lport 0d74b914-0dbd-4356-8304-a42943811e2e from this chassis (sb_readonly=0)
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00147|binding|INFO|Setting lport 0d74b914-0dbd-4356-8304-a42943811e2e down in Southbound
Dec  5 07:02:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:01Z|00148|binding|INFO|Removing iface tap0d74b914-0d ovn-installed in OVS
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.226 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.227 187212 DEBUG nova.virt.disk.api [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.227 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.234 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a5:e2:69 10.100.0.10'], port_security=['fa:16:3e:a5:e2:69 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7b8cf31f-430b-4c7f-9c33-7d0cadd44d31', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-423f0bba-22e2-4219-9338-a671dbe69e42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c184f0f2b71412fb560981314d0574d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8510d8eb-f367-43d1-be5f-8be0c3ab7e61', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eacee27-dbb3-4c60-a47d-c1f874faea06, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0d74b914-0dbd-4356-8304-a42943811e2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.246 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54f70b11-5508-4fee-b54f-0b575c943b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec  5 07:02:01 np0005546909 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000018.scope: Consumed 13.107s CPU time.
Dec  5 07:02:01 np0005546909 systemd-machined[153543]: Machine qemu-27-instance-00000018 terminated.
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.288 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5ff3e4-3c1e-4dee-bf6f-7cab6b527aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.292 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.292 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f55e60f2-c537-4c6e-a1b6-733ac8bc2bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.292 187212 DEBUG nova.virt.disk.api [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.293 187212 DEBUG nova.objects.instance [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.308 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.309 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Ensure instance console log exists: /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.309 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.310 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.310 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.321 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[50a59568-b576-4592-bd21-7ae8ed94b570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[50f1699e-5b96-49d6-963d-533a87cde544]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap423f0bba-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:51:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347615, 'reachable_time': 21997, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217670, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 NetworkManager[55691]: <info>  [1764936121.3657] manager: (tap0d74b914-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.365 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbffeae6-9b11-4b9a-88b5-ff812fe6f18f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347626, 'tstamp': 347626}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217673, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap423f0bba-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347629, 'tstamp': 347629}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217673, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.368 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.370 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.380 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.395 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap423f0bba-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.396 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap423f0bba-20, col_values=(('external_ids', {'iface-id': '8801ec73-6ce8-4039-ab6c-4693dcbc877e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.397 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.399 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0d74b914-0dbd-4356-8304-a42943811e2e in datapath 423f0bba-22e2-4219-9338-a671dbe69e42 unbound from our chassis#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.401 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 423f0bba-22e2-4219-9338-a671dbe69e42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.405 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[279203da-9479-4f09-82eb-24fecb833e2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.407 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 namespace which is not needed anymore#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.423 187212 INFO nova.virt.libvirt.driver [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Instance destroyed successfully.#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.423 187212 DEBUG nova.objects.instance [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lazy-loading 'resources' on Instance uuid 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.436 187212 DEBUG nova.virt.libvirt.vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.437 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "02d6eab5-4561-4d9f-ad9a-169b57667224", "address": "fa:16:3e:d4:b7:ec", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap02d6eab5-45", "ovs_interfaceid": "02d6eab5-4561-4d9f-ad9a-169b57667224", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.438 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.438 187212 DEBUG os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.441 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d6eab5-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.448 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.451 187212 INFO os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:b7:ec,bridge_name='br-int',has_traffic_filtering=True,id=02d6eab5-4561-4d9f-ad9a-169b57667224,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap02d6eab5-45')#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.454 187212 DEBUG nova.virt.libvirt.vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1046212835',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1046212835',id=24,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c184f0f2b71412fb560981314d0574d',ramdisk_id='',reservation_id='r-gepf0n33',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_pr
oject_name='tempest-AttachInterfacesV270Test-1975383464',owner_user_name='tempest-AttachInterfacesV270Test-1975383464-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:35Z,user_data=None,user_id='9a5b1ecad65045afbe3c154494417765',uuid=7b8cf31f-430b-4c7f-9c33-7d0cadd44d31,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.455 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converting VIF {"id": "0d74b914-0dbd-4356-8304-a42943811e2e", "address": "fa:16:3e:a5:e2:69", "network": {"id": "423f0bba-22e2-4219-9338-a671dbe69e42", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1652559979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c184f0f2b71412fb560981314d0574d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0d74b914-0d", "ovs_interfaceid": "0d74b914-0dbd-4356-8304-a42943811e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.455 187212 DEBUG nova.network.os_vif_util [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.456 187212 DEBUG os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.457 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.457 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d74b914-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.464 187212 INFO os_vif [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a5:e2:69,bridge_name='br-int',has_traffic_filtering=True,id=0d74b914-0dbd-4356-8304-a42943811e2e,network=Network(423f0bba-22e2-4219-9338-a671dbe69e42),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0d74b914-0d')#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.464 187212 INFO nova.virt.libvirt.driver [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deleting instance files /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31_del#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.465 187212 INFO nova.virt.libvirt.driver [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deletion of /var/lib/nova/instances/7b8cf31f-430b-4c7f-9c33-7d0cadd44d31_del complete#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.516 187212 INFO nova.compute.manager [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.516 187212 DEBUG oslo.service.loopingcall [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.517 187212 DEBUG nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.517 187212 DEBUG nova.network.neutron [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [NOTICE]   (217003) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : Exiting Master process...
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : Exiting Master process...
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [ALERT]    (217003) : Current worker (217006) exited with code 143 (Terminated)
Dec  5 07:02:01 np0005546909 neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42[216985]: [WARNING]  (217003) : All workers exited. Exiting... (0)
Dec  5 07:02:01 np0005546909 systemd[1]: libpod-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope: Deactivated successfully.
Dec  5 07:02:01 np0005546909 conmon[216985]: conmon 719877fc51b88939aab7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope/container/memory.events
Dec  5 07:02:01 np0005546909 podman[217712]: 2025-12-05 12:02:01.548802863 +0000 UTC m=+0.044955846 container died 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:02:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:01 np0005546909 systemd[1]: var-lib-containers-storage-overlay-5c6b011a5a4020501a80db3fcb573c57c4bdcfb8d5de9a077c6de3d75c9302b5-merged.mount: Deactivated successfully.
Dec  5 07:02:01 np0005546909 podman[217712]: 2025-12-05 12:02:01.5878937 +0000 UTC m=+0.084046703 container cleanup 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:02:01 np0005546909 systemd[1]: libpod-conmon-719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8.scope: Deactivated successfully.
Dec  5 07:02:01 np0005546909 podman[217737]: 2025-12-05 12:02:01.660487623 +0000 UTC m=+0.050717150 container remove 719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[575cb5f6-d59b-4035-80c4-c07c529d9f59]: (4, ('Fri Dec  5 12:02:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 (719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8)\n719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8\nFri Dec  5 12:02:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 (719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8)\n719877fc51b88939aab72787415feaf65808c1553646816a7b651ddad4ba97e8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.674 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87402e80-0338-487c-bea3-4dc740afd971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.675 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap423f0bba-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:01 np0005546909 kernel: tap423f0bba-20: left promiscuous mode
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.732 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9846cd4e-693b-4049-b163-70bb0351ba6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a3116a-284b-445c-8d91-d270642ebf71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.745 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfb812d-1899-48f8-967c-4c0d49c80017]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.759 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b07cd0ba-f06e-4d2f-b4f1-b816a193b7b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347608, 'reachable_time': 27761, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217750, 'error': None, 'target': 'ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.761 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-423f0bba-22e2-4219-9338-a671dbe69e42 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:01.761 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[33d6e22a-cc43-4dee-86ac-c0f8cf251e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:01 np0005546909 systemd[1]: run-netns-ovnmeta\x2d423f0bba\x2d22e2\x2d4219\x2d9338\x2da671dbe69e42.mount: Deactivated successfully.
Dec  5 07:02:01 np0005546909 nova_compute[187208]: 2025-12-05 12:02:01.972 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Successfully created port: f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:02:02 np0005546909 nova_compute[187208]: 2025-12-05 12:02:02.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:02Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:02:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:02Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:4f:38 10.100.0.13
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.009 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.010 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.143 187212 DEBUG nova.network.neutron [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.161 187212 INFO nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Took 1.64 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.203 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Successfully updated port: f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.207 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.207 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.217 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.218 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.218 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.367 187212 DEBUG nova.compute.provider_tree [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.383 187212 DEBUG nova.scheduler.client.report [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.407 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.440 187212 INFO nova.scheduler.client.report [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Deleted allocations for instance 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.487 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.493 187212 DEBUG oslo_concurrency.lockutils [None req-dc4fbd56-2f9e-421f-b04c-8ac1b83b9d05 9a5b1ecad65045afbe3c154494417765 2c184f0f2b71412fb560981314d0574d - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.547 187212 DEBUG nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG oslo_concurrency.lockutils [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.548 187212 DEBUG nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.549 187212 WARNING nova.compute.manager [req-11ef507f-db4b-4cca-ad58-a17b4df3c742 req-a8e529a5-47bf-4edf-ad5c-ff15f1b60506 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-unplugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.727 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.727 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.728 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.729 187212 INFO nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Terminating instance#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.730 187212 DEBUG nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:03 np0005546909 kernel: tap9bb4b8ce-57 (unregistering): left promiscuous mode
Dec  5 07:02:03 np0005546909 NetworkManager[55691]: <info>  [1764936123.7490] device (tap9bb4b8ce-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:03Z|00149|binding|INFO|Releasing lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d from this chassis (sb_readonly=0)
Dec  5 07:02:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:03Z|00150|binding|INFO|Setting lport 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d down in Southbound
Dec  5 07:02:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:03Z|00151|binding|INFO|Removing iface tap9bb4b8ce-57 ovn-installed in OVS
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.758 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:f0:53 10.100.0.14'], port_security=['fa:16:3e:06:f0:53 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1282e776-5758-493b-8f52-59839ebcd31b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-455bb7e1-6680-472e-861f-da50aef09a7f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8400e354e93c4b33b8d683012dfe5c94', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9474b356-5c55-44a1-af48-0eeaf9a9ad0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07a9feeb-8467-4a6f-b0e2-fda2f133d3ac, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.759 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9bb4b8ce-5722-4698-aa3d-6d891ab14b0d in datapath 455bb7e1-6680-472e-861f-da50aef09a7f unbound from our chassis#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.762 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 455bb7e1-6680-472e-861f-da50aef09a7f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.762 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a69778-8747-484a-ae23-e8552908e5dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.763 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f namespace which is not needed anymore#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:03 np0005546909 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec  5 07:02:03 np0005546909 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000017.scope: Consumed 6.174s CPU time.
Dec  5 07:02:03 np0005546909 systemd-machined[153543]: Machine qemu-28-instance-00000017 terminated.
Dec  5 07:02:03 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:03 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [NOTICE]   (217165) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:03 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [WARNING]  (217165) : Exiting Master process...
Dec  5 07:02:03 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [ALERT]    (217165) : Current worker (217168) exited with code 143 (Terminated)
Dec  5 07:02:03 np0005546909 neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f[217155]: [WARNING]  (217165) : All workers exited. Exiting... (0)
Dec  5 07:02:03 np0005546909 systemd[1]: libpod-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope: Deactivated successfully.
Dec  5 07:02:03 np0005546909 podman[217773]: 2025-12-05 12:02:03.887582796 +0000 UTC m=+0.042516395 container died 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:02:03 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:03 np0005546909 systemd[1]: var-lib-containers-storage-overlay-4149b47c4e6793ce3b6fc5ffdd499aa39b6bd1d4bb7bbc0659f951080559deea-merged.mount: Deactivated successfully.
Dec  5 07:02:03 np0005546909 podman[217773]: 2025-12-05 12:02:03.92061062 +0000 UTC m=+0.075544219 container cleanup 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:02:03 np0005546909 systemd[1]: libpod-conmon-62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b.scope: Deactivated successfully.
Dec  5 07:02:03 np0005546909 podman[217801]: 2025-12-05 12:02:03.991296429 +0000 UTC m=+0.050219125 container remove 62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.995 187212 INFO nova.virt.libvirt.driver [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Instance destroyed successfully.#033[00m
Dec  5 07:02:03 np0005546909 nova_compute[187208]: 2025-12-05 12:02:03.996 187212 DEBUG nova.objects.instance [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lazy-loading 'resources' on Instance uuid 1282e776-5758-493b-8f52-59839ebcd31b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:03.998 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3b37aedc-58bc-45c5-9a79-60348a8be8f1]: (4, ('Fri Dec  5 12:02:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f (62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b)\n62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b\nFri Dec  5 12:02:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f (62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b)\n62b3a4fd01893b57ee707f34db0bccf88cc270f6be10bca3918254877c60894b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.000 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b93f904-a312-4f29-b7e8-c3590970774f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.000 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap455bb7e1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.013 187212 DEBUG nova.virt.libvirt.vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1341604448',display_name='tempest-ImagesNegativeTestJSON-server-1341604448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1341604448',id=23,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8400e354e93c4b33b8d683012dfe5c94',ramdisk_id='',reservation_id='r-7yvjshh1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-965315619',owner_user_name='tempest-ImagesNegativeTestJSON-965315619-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:01:58Z,user_data=None,user_id='496da6872d53413ea1c201178cf5b05c',uuid=1282e776-5758-493b-8f52-59839ebcd31b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.013 187212 DEBUG nova.network.os_vif_util [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converting VIF {"id": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "address": "fa:16:3e:06:f0:53", "network": {"id": "455bb7e1-6680-472e-861f-da50aef09a7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1464272812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8400e354e93c4b33b8d683012dfe5c94", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9bb4b8ce-57", "ovs_interfaceid": "9bb4b8ce-5722-4698-aa3d-6d891ab14b0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.014 187212 DEBUG nova.network.os_vif_util [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.014 187212 DEBUG os_vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bb4b8ce-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:04 np0005546909 kernel: tap455bb7e1-60: left promiscuous mode
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.020 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.023 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.027 187212 INFO os_vif [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:f0:53,bridge_name='br-int',has_traffic_filtering=True,id=9bb4b8ce-5722-4698-aa3d-6d891ab14b0d,network=Network(455bb7e1-6680-472e-861f-da50aef09a7f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9bb4b8ce-57')#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.028 187212 INFO nova.virt.libvirt.driver [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deleting instance files /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b_del#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.028 187212 INFO nova.virt.libvirt.driver [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deletion of /var/lib/nova/instances/1282e776-5758-493b-8f52-59839ebcd31b_del complete#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b201cd-b475-46c5-9d94-b6f4bfb56e8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.046 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[14eaf412-4fb8-446f-b53b-4a06c770bf7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.048 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a9961fc1-39cf-496b-b269-21249f88224a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.064 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c74d7cc5-1b1b-417d-80f4-8c6360141032]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347938, 'reachable_time': 40453, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217832, 'error': None, 'target': 'ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.066 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-455bb7e1-6680-472e-861f-da50aef09a7f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:04.066 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[e87673e4-29f6-4ba7-8413-3335c80c3937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:04 np0005546909 systemd[1]: run-netns-ovnmeta\x2d455bb7e1\x2d6680\x2d472e\x2d861f\x2dda50aef09a7f.mount: Deactivated successfully.
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.092 187212 INFO nova.compute.manager [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.092 187212 DEBUG oslo.service.loopingcall [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.093 187212 DEBUG nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.093 187212 DEBUG nova.network.neutron [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.760 187212 DEBUG nova.compute.manager [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-changed-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.761 187212 DEBUG nova.compute.manager [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Refreshing instance network info cache due to event network-changed-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.761 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.918 187212 DEBUG nova.network.neutron [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance network_info: |[{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.951 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.952 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Refreshing network info cache for port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.955 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start _get_guest_xml network_info=[{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.959 187212 WARNING nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.963 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.963 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.966 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.966 187212 DEBUG nova.virt.libvirt.host [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.967 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.967 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.968 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.968 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.969 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.970 187212 DEBUG nova.virt.hardware [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.975 187212 DEBUG nova.virt.libvirt.vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:00Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.975 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.976 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.976 187212 DEBUG nova.objects.instance [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:04 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.996 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <uuid>795a269a-5af9-4e6a-bf1f-e2bb83634855</uuid>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <name>instance-0000001b</name>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesTestJSON-server-1602902542</nova:name>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:04</nova:creationTime>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        <nova:port uuid="f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="serial">795a269a-5af9-4e6a-bf1f-e2bb83634855</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="uuid">795a269a-5af9-4e6a-bf1f-e2bb83634855</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:5f:9d:39"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <target dev="tapf6fc1ec5-ea"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:04 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/console.log" append="off"/>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:04 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:05 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:05 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:05 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:05 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:05 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Preparing to wait for external event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.997 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.998 187212 DEBUG nova.virt.libvirt.vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-membe
r'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:00Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.998 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG nova.network.os_vif_util [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG os_vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:04.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6fc1ec5-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6fc1ec5-ea, col_values=(('external_ids', {'iface-id': 'f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:9d:39', 'vm-uuid': '795a269a-5af9-4e6a-bf1f-e2bb83634855'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.0069] manager: (tapf6fc1ec5-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.010 187212 INFO os_vif [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea')#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.064 187212 DEBUG nova.network.neutron [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.067 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.068 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.068 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:5f:9d:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.069 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Using config drive#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.080 187212 INFO nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Took 0.99 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.128 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.128 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.326 187212 DEBUG nova.compute.provider_tree [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.350 187212 DEBUG nova.scheduler.client.report [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.393 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.428 187212 INFO nova.scheduler.client.report [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Deleted allocations for instance 1282e776-5758-493b-8f52-59839ebcd31b#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.485 187212 INFO nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Creating config drive at /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.490 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz5h6rcr6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.523 187212 DEBUG oslo_concurrency.lockutils [None req-6f460616-4496-4118-ba79-c9ea7710e53f 496da6872d53413ea1c201178cf5b05c 8400e354e93c4b33b8d683012dfe5c94 - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.619 187212 DEBUG oslo_concurrency.processutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz5h6rcr6" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:05 np0005546909 kernel: tapf6fc1ec5-ea: entered promiscuous mode
Dec  5 07:02:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:05Z|00152|binding|INFO|Claiming lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for this chassis.
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:05Z|00153|binding|INFO|f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa: Claiming fa:16:3e:5f:9d:39 10.100.0.12
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.6703] manager: (tapf6fc1ec5-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.681 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:39 10.100.0.12'], port_security=['fa:16:3e:5f:9d:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '795a269a-5af9-4e6a-bf1f-e2bb83634855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.682 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.686 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.696 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5db2eb61-a2a5-4ff2-858b-fa1e9cdb653f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.697 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.699 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.699 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[befe14be-fc4e-4392-bb99-ec1e067fbbd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 systemd-udevd[217850]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.700 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9cdbd98a-e977-4591-b4d7-dabbe74f3a6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.711 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[bbce06b2-50dd-46cc-90b8-3e3823f14823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.7139] device (tapf6fc1ec5-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.7150] device (tapf6fc1ec5-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:02:05 np0005546909 systemd-machined[153543]: New machine qemu-31-instance-0000001b.
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.745 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f496c5b-03df-4ccd-a0b3-435b84b93e6f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 systemd[1]: Started Virtual Machine qemu-31-instance-0000001b.
Dec  5 07:02:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:05Z|00154|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa ovn-installed in OVS
Dec  5 07:02:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:05Z|00155|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa up in Southbound
Dec  5 07:02:05 np0005546909 nova_compute[187208]: 2025-12-05 12:02:05.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.779 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[940c4343-60bc-46dc-b82d-fd06fb546902]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.7848] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.784 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6f5249-65ab-4c9c-9bd0-689454f2b320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.820 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[832bc666-fd1a-4eed-ad6a-ee46ae4bd158]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.824 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9997b021-a840-4646-ad02-8a4db6cfc6b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 NetworkManager[55691]: <info>  [1764936125.8538] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.860 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fe456d-9b33-4f3b-b2fc-ca588d1a2839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3be7b72-a1bd-4f36-b0be-601e815d6836]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350641, 'reachable_time': 41239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217885, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.894 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6de8f94f-fd46-4511-957a-a2be46051bdf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 350641, 'tstamp': 350641}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4e82f9-3aa1-4570-b1d9-79c0def44f20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350641, 'reachable_time': 41239, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217887, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:05.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[955c4df0-1ef8-477d-8dd9-cc910936dd29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.017 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9addf6a9-7ff2-4a6d-8dfe-f051b9db6452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.019 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:02:06 np0005546909 NetworkManager[55691]: <info>  [1764936126.0238] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.025 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00156|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.028 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84d23969-5818-4ee9-9f39-ed02e2e99fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.030 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.032 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.100 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936126.0987484, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.100 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Started (Lifecycle Event)#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.121 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.125 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936126.100037, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.126 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.143 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.151 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.174 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:06 np0005546909 podman[217926]: 2025-12-05 12:02:06.399425683 +0000 UTC m=+0.045103970 container create 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:02:06 np0005546909 systemd[1]: Started libpod-conmon-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope.
Dec  5 07:02:06 np0005546909 podman[217926]: 2025-12-05 12:02:06.375558101 +0000 UTC m=+0.021236408 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:06 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:06 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55714bde8320c7ed878ac9913f0bc25c165a8c6293c219aeaa49515054bdec12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:06 np0005546909 podman[217926]: 2025-12-05 12:02:06.504090873 +0000 UTC m=+0.149769180 container init 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:02:06 np0005546909 podman[217926]: 2025-12-05 12:02:06.511080833 +0000 UTC m=+0.156759120 container start 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:02:06 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : New worker (217948) forked
Dec  5 07:02:06 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : Loading success.
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.736 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.737 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.738 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.738 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.739 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.740 187212 INFO nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Terminating instance#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.741 187212 DEBUG nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:06 np0005546909 kernel: tap47612a1a-e4 (unregistering): left promiscuous mode
Dec  5 07:02:06 np0005546909 NetworkManager[55691]: <info>  [1764936126.7773] device (tap47612a1a-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00157|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=0)
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00158|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down in Southbound
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00159|binding|INFO|Removing iface tap47612a1a-e4 ovn-installed in OVS
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.790 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.798 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.800 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.804 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.817 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e359f209-53bb-4263-99d5-4084125dc106]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec  5 07:02:06 np0005546909 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000013.scope: Consumed 14.316s CPU time.
Dec  5 07:02:06 np0005546909 systemd-machined[153543]: Machine qemu-19-instance-00000013 terminated.
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.843 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3adc6c6e-062e-451f-97e1-1701dfdb1912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.845 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7900d64e-32e1-461e-869a-b2769e8cff9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 podman[217958]: 2025-12-05 12:02:06.865894479 +0000 UTC m=+0.057417101 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.880 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8967182b-6b1a-464b-b7e7-4a596e691a63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.898 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[932b0107-b235-4d9d-809b-f721e0b79ee7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217987, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1819c85a-6957-4d19-8c07-fde69cc01d04]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217988, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217988, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.917 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.926 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.926 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.927 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.927 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:06 np0005546909 kernel: tap47612a1a-e4: entered promiscuous mode
Dec  5 07:02:06 np0005546909 kernel: tap47612a1a-e4 (unregistering): left promiscuous mode
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00160|binding|INFO|Claiming lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e for this chassis.
Dec  5 07:02:06 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00161|binding|INFO|47612a1a-e470-434b-927c-8fcd6c2fbe4e: Claiming fa:16:3e:45:e2:12 10.100.0.10
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.992 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.993 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 bound to our chassis#033[00m
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00162|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e ovn-installed in OVS
Dec  5 07:02:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00163|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e up in Southbound
Dec  5 07:02:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:06.996 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:06.997 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00164|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=1)
Dec  5 07:02:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00165|binding|INFO|Removing iface tap47612a1a-e4 ovn-installed in OVS
Dec  5 07:02:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00166|if_status|INFO|Not setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down as sb is readonly
Dec  5 07:02:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:06Z|00167|binding|INFO|Releasing lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e from this chassis (sb_readonly=0)
Dec  5 07:02:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:07Z|00168|binding|INFO|Setting lport 47612a1a-e470-434b-927c-8fcd6c2fbe4e down in Southbound
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.001 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.010 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:e2:12 10.100.0.10'], port_security=['fa:16:3e:45:e2:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd95c0324-d1d3-4960-9ab7-3a2a098a9f7c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=47612a1a-e470-434b-927c-8fcd6c2fbe4e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4ad23d-aa2a-4e83-973b-a6626cf3fe94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.016 187212 INFO nova.virt.libvirt.driver [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Instance destroyed successfully.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.017 187212 DEBUG nova.objects.instance [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid d95c0324-d1d3-4960-9ab7-3a2a098a9f7c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.029 187212 DEBUG nova.virt.libvirt.vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1974624987',display_name='tempest-ServersAdminTestJSON-server-1974624987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1974624987',id=19,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-wb0rav5h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:49Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=d95c0324-d1d3-4960-9ab7-3a2a098a9f7c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.030 187212 DEBUG nova.network.os_vif_util [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "address": "fa:16:3e:45:e2:12", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47612a1a-e4", "ovs_interfaceid": "47612a1a-e470-434b-927c-8fcd6c2fbe4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.031 187212 DEBUG nova.network.os_vif_util [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.031 187212 DEBUG os_vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.033 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47612a1a-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.038 187212 INFO os_vif [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:e2:12,bridge_name='br-int',has_traffic_filtering=True,id=47612a1a-e470-434b-927c-8fcd6c2fbe4e,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47612a1a-e4')#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.038 187212 INFO nova.virt.libvirt.driver [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deleting instance files /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c_del#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.039 187212 INFO nova.virt.libvirt.driver [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deletion of /var/lib/nova/instances/d95c0324-d1d3-4960-9ab7-3a2a098a9f7c_del complete#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.041 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35c39737-4674-4c68-9b56-15e74547af12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.043 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[148a7f0b-53fa-4ac0-b16c-5e86b2833247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[47f890c4-6f8d-4e62-b357-ae3d8565273f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[838fc71a-66f5-4b5c-8379-c6a59a7f0c48]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 21, 'rx_bytes': 868, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 21, 'rx_bytes': 868, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218011, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 INFO nova.compute.manager [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 DEBUG oslo.service.loopingcall [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.088 187212 DEBUG nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.089 187212 DEBUG nova.network.neutron [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.099 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35d79243-1fd6-4782-913a-f34079f91440]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218012, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.101 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.103 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.105 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.106 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.108 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 47612a1a-e470-434b-927c-8fcd6c2fbe4e in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.110 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.126 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc20ca5-11b6-4a4d-a0ce-5542541c965a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.155 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bb85ea67-681a-46f2-a721-3beccb6d8738]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.157 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3ba55b-9b66-4d7a-acaa-2b0a2ace2486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.189 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea8872d-1631-48d1-9bbb-a77f1f053818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bec57974-7caa-4847-954a-0d69b7536807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 23, 'rx_bytes': 868, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 23, 'rx_bytes': 868, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218018, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.225 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f28524e-9d87-436e-9011-c39be756601f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218019, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218019, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.226 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.228 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.229 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:07.230 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.357 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updated VIF entry in instance network info cache for port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.358 187212 DEBUG nova.network.neutron [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [{"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.388 187212 DEBUG oslo_concurrency.lockutils [req-fc359d6d-5c96-4a5d-90b2-ad93b1221d46 req-5fe03d7e-09bc-48d6-8cf2-ecd199e1297b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-795a269a-5af9-4e6a-bf1f-e2bb83634855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.482 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.483 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-02d6eab5-4561-4d9f-ad9a-169b57667224 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.484 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-unplugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.485 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG oslo_concurrency.lockutils [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7b8cf31f-430b-4c7f-9c33-7d0cadd44d31-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] No waiting events found dispatching network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.486 187212 WARNING nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received unexpected event network-vif-plugged-0d74b914-0dbd-4356-8304-a42943811e2e for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-deleted-02d6eab5-4561-4d9f-ad9a-169b57667224 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Received event network-vif-deleted-0d74b914-0dbd-4356-8304-a42943811e2e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.487 187212 DEBUG nova.compute.manager [req-cfde4011-6de5-406c-9eec-e19884cf7d4c req-4b3e9c59-a100-4933-bcde-b8660ef82e30 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-deleted-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.609 187212 DEBUG nova.network.neutron [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.626 187212 INFO nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Took 0.54 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.673 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.673 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.837 187212 DEBUG nova.compute.provider_tree [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.854 187212 DEBUG nova.scheduler.client.report [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.873 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.903 187212 INFO nova.scheduler.client.report [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance d95c0324-d1d3-4960-9ab7-3a2a098a9f7c#033[00m
Dec  5 07:02:07 np0005546909 nova_compute[187208]: 2025-12-05 12:02:07.959 187212 DEBUG oslo_concurrency.lockutils [None req-02688b2a-c1ee-46b6-8ca4-3b2d9a67156b 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.102 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Processing event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.103 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.104 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received unexpected event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.105 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Processing event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.106 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.107 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.108 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-unplugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state deleted and task_state None.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1282e776-5758-493b-8f52-59839ebcd31b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.109 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1282e776-5758-493b-8f52-59839ebcd31b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] No waiting events found dispatching network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 WARNING nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Received unexpected event network-vif-plugged-9bb4b8ce-5722-4698-aa3d-6d891ab14b0d for instance with vm_state deleted and task_state None.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.110 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG oslo_concurrency.lockutils [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.111 187212 DEBUG nova.compute.manager [req-fac8788e-4500-4a9d-a4e2-bf47f8472302 req-6ccc655f-f855-45d9-8933-1064c9b8db4d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Processing event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.112 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance event wait completed in 21 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.113 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance event wait completed in 15 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.114 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.119 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.121 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1202078, adc15883-b705-42dd-ac95-04f4b8964012 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.121 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Resumed (Lifecycle Event)
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.126 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.129 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.134 187212 INFO nova.virt.libvirt.driver [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance spawned successfully.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.135 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.137 187212 INFO nova.virt.libvirt.driver [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance spawned successfully.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.137 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.142 187212 INFO nova.virt.libvirt.driver [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance spawned successfully.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.143 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.155 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.171 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.172 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.172 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.173 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.173 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.174 187212 DEBUG nova.virt.libvirt.driver [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1224675, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.179 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Resumed (Lifecycle Event)
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.182 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.182 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.183 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.184 187212 DEBUG nova.virt.libvirt.driver [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.195 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.195 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.196 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.197 187212 DEBUG nova.virt.libvirt.driver [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.217 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.221 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.263 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936128.1253827, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.263 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Resumed (Lifecycle Event)
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.309 187212 INFO nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 26.85 seconds to spawn the instance on the hypervisor.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.309 187212 DEBUG nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.311 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.317 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.323 187212 INFO nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 33.44 seconds to spawn the instance on the hypervisor.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.324 187212 DEBUG nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.353 187212 INFO nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 7.65 seconds to spawn the instance on the hypervisor.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.354 187212 DEBUG nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.365 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.435 187212 INFO nova.compute.manager [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 28.15 seconds to build instance.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.449 187212 INFO nova.compute.manager [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 8.79 seconds to build instance.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.452 187212 INFO nova.compute.manager [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 34.17 seconds to build instance.
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.464 187212 DEBUG oslo_concurrency.lockutils [None req-f99b65f4-12f6-496f-812f-2bccd0ac06ea ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 28.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.467 187212 DEBUG oslo_concurrency.lockutils [None req-db21480f-4353-4da4-9e49-acf77a8a5462 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:08 np0005546909 nova_compute[187208]: 2025-12-05 12:02:08.469 187212 DEBUG oslo_concurrency.lockutils [None req-57dc011f-d809-4790-a940-274f16fa9b3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.314 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.317 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.318 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.320 187212 INFO nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Terminating instance
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.321 187212 DEBUG nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:02:09 np0005546909 kernel: tap75a214ef-2b (unregistering): left promiscuous mode
Dec  5 07:02:09 np0005546909 NetworkManager[55691]: <info>  [1764936129.5562] device (tap75a214ef-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.571 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:09Z|00169|binding|INFO|Releasing lport 75a214ef-2b9f-4c81-bdad-de5791244b85 from this chassis (sb_readonly=0)
Dec  5 07:02:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:09Z|00170|binding|INFO|Setting lport 75a214ef-2b9f-4c81-bdad-de5791244b85 down in Southbound
Dec  5 07:02:09 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:09Z|00171|binding|INFO|Removing iface tap75a214ef-2b ovn-installed in OVS
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.585 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:46:fb 10.100.0.5'], port_security=['fa:16:3e:d9:46:fb 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4e7aec76-673e-48b5-b183-cc9c7a95fd37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=75a214ef-2b9f-4c81-bdad-de5791244b85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.587 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 75a214ef-2b9f-4c81-bdad-de5791244b85 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.589 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.592 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.612 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d34152de-007b-4f03-b61d-583f1c35232c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec  5 07:02:09 np0005546909 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000010.scope: Consumed 15.767s CPU time.
Dec  5 07:02:09 np0005546909 systemd-machined[153543]: Machine qemu-16-instance-00000010 terminated.
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.652 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[312c9ef0-f242-475e-ac30-2d3b321a333a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.656 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[661ce4c5-8754-493e-bc0e-d763586b9a9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.697 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[937984b0-b985-45fc-92e9-2b6df9598c6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.724 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[47318d5c-ce5a-4bc3-9eeb-6054615967c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 25, 'rx_bytes': 952, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 25, 'rx_bytes': 952, 'tx_bytes': 1194, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218033, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.741 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[429f6506-c206-4a80-8442-3423b7d16282]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218034, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218034, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.743 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.755 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.755 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.756 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:09.756 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.789 187212 INFO nova.virt.libvirt.driver [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Instance destroyed successfully.#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.790 187212 DEBUG nova.objects.instance [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 4e7aec76-673e-48b5-b183-cc9c7a95fd37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.804 187212 DEBUG nova.virt.libvirt.vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:00:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-720093205',display_name='tempest-ServersAdminTestJSON-server-720093205',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-720093205',id=16,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-00wbi3mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:26Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=4e7aec76-673e-48b5-b183-cc9c7a95fd37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.804 187212 DEBUG nova.network.os_vif_util [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "75a214ef-2b9f-4c81-bdad-de5791244b85", "address": "fa:16:3e:d9:46:fb", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75a214ef-2b", "ovs_interfaceid": "75a214ef-2b9f-4c81-bdad-de5791244b85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.805 187212 DEBUG nova.network.os_vif_util [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.805 187212 DEBUG os_vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.807 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75a214ef-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.814 187212 INFO os_vif [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:46:fb,bridge_name='br-int',has_traffic_filtering=True,id=75a214ef-2b9f-4c81-bdad-de5791244b85,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75a214ef-2b')#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.815 187212 INFO nova.virt.libvirt.driver [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deleting instance files /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37_del#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.815 187212 INFO nova.virt.libvirt.driver [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deletion of /var/lib/nova/instances/4e7aec76-673e-48b5-b183-cc9c7a95fd37_del complete#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.866 187212 INFO nova.compute.manager [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG oslo.service.loopingcall [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:09 np0005546909 nova_compute[187208]: 2025-12-05 12:02:09.867 187212 DEBUG nova.network.neutron [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:10 np0005546909 nova_compute[187208]: 2025-12-05 12:02:10.967 187212 DEBUG nova.compute.manager [req-0bad4528-4357-4b90-8150-21eba27b1818 req-80fb344c-cb81-4d22-8d09-cbd9850dbb0c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-deleted-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.085 187212 INFO nova.compute.manager [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Pausing#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.086 187212 DEBUG nova.objects.instance [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'flavor' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.116 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936131.1160843, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.116 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.118 187212 DEBUG nova.compute.manager [None req-48c50a20-93b6-40c9-880d-24f98e4785f8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.161 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.164 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.192 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.371 187212 DEBUG nova.network.neutron [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.399 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.400 187212 WARNING nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state paused and task_state None.#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG oslo_concurrency.lockutils [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d95c0324-d1d3-4960-9ab7-3a2a098a9f7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.401 187212 DEBUG nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] No waiting events found dispatching network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.402 187212 WARNING nova.compute.manager [req-4ae3bcdb-4ee6-455a-bc6e-928bc582bbb6 req-bc6a2605-af14-4251-bd0e-f28ef492eaf8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Received unexpected event network-vif-plugged-47612a1a-e470-434b-927c-8fcd6c2fbe4e for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.403 187212 INFO nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Took 1.54 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.450 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.450 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.584 187212 DEBUG nova.compute.provider_tree [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.600 187212 DEBUG nova.scheduler.client.report [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.624 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.647 187212 INFO nova.scheduler.client.report [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 4e7aec76-673e-48b5-b183-cc9c7a95fd37#033[00m
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.720 187212 DEBUG oslo_concurrency.lockutils [None req-c260816c-8d50-4a9c-a895-c5868bbf16fb 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:11Z|00172|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:11Z|00173|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:02:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:11Z|00174|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:11Z|00175|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:02:11 np0005546909 nova_compute[187208]: 2025-12-05 12:02:11.861 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:12 np0005546909 NetworkManager[55691]: <info>  [1764936132.0202] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec  5 07:02:12 np0005546909 NetworkManager[55691]: <info>  [1764936132.0218] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec  5 07:02:12 np0005546909 nova_compute[187208]: 2025-12-05 12:02:12.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:12 np0005546909 nova_compute[187208]: 2025-12-05 12:02:12.201 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:12Z|00176|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:12Z|00177|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:02:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:12Z|00178|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:12Z|00179|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:02:12 np0005546909 nova_compute[187208]: 2025-12-05 12:02:12.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:12 np0005546909 nova_compute[187208]: 2025-12-05 12:02:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.956 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.957 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 WARNING nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-unplugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.958 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.959 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.959 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "4e7aec76-673e-48b5-b183-cc9c7a95fd37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] No waiting events found dispatching network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.960 187212 WARNING nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received unexpected event network-vif-plugged-75a214ef-2b9f-4c81-bdad-de5791244b85 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG nova.compute.manager [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing instance network info cache due to event network-changed-78310fa8-21e8-49e5-8b60-867d1089ad71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.961 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.962 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:13 np0005546909 nova_compute[187208]: 2025-12-05 12:02:13.963 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Refreshing network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.273 187212 DEBUG nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.357 187212 INFO nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] instance snapshotting#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.358 187212 WARNING nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] trying to snapshot a non-running instance: (state: 3 expected: 1)#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.621 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.622 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.623 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.625 187212 INFO nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Terminating instance#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.626 187212 DEBUG nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:14 np0005546909 kernel: tapf194d74d-a9 (unregistering): left promiscuous mode
Dec  5 07:02:14 np0005546909 NetworkManager[55691]: <info>  [1764936134.6495] device (tapf194d74d-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:14Z|00180|binding|INFO|Releasing lport f194d74d-a9ec-4838-b35d-8393a2087ec5 from this chassis (sb_readonly=0)
Dec  5 07:02:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:14Z|00181|binding|INFO|Setting lport f194d74d-a9ec-4838-b35d-8393a2087ec5 down in Southbound
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:14Z|00182|binding|INFO|Removing iface tapf194d74d-a9 ovn-installed in OVS
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.658 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.667 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:fa:14 10.100.0.14'], port_security=['fa:16:3e:d0:fa:14 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f194d74d-a9ec-4838-b35d-8393a2087ec5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.668 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f194d74d-a9ec-4838-b35d-8393a2087ec5 in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.671 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e743e2d-af6d-4559-aad9-9cd9df46ecaa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.715 187212 DEBUG nova.compute.manager [req-ba2385d3-df5f-40fb-bf8a-2f5d629cb9d6 req-12d93cdc-81ad-4c47-b3e8-668f1212b4bf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Received event network-vif-deleted-75a214ef-2b9f-4c81-bdad-de5791244b85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:14 np0005546909 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec  5 07:02:14 np0005546909 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000d.scope: Consumed 17.877s CPU time.
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.722 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e41f79d7-812e-4e64-9441-d9ef047a9288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 systemd-machined[153543]: Machine qemu-14-instance-0000000d terminated.
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.726 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[98e29474-79a7-4444-b672-0f700935d583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.751 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d5fda0a3-4aae-4ae8-add4-0520202cecd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.765 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f056fd39-0db5-406e-91ff-775b27adfed1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24c61e5e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:ed:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 16, 'tx_packets': 27, 'rx_bytes': 952, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 16, 'tx_packets': 27, 'rx_bytes': 952, 'tx_bytes': 1278, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338518, 'reachable_time': 30383, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218066, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.780 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e40e1b0c-2ae7-4fa6-bd96-53f6d52f203d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338532, 'tstamp': 338532}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218067, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24c61e5e-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 338535, 'tstamp': 338535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218067, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.782 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.787 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.787 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24c61e5e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24c61e5e-70, col_values=(('external_ids', {'iface-id': '1f09e8e7-18eb-4523-a8bb-10fee2270a91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:14.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.837 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Beginning live snapshot process#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.892 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Instance destroyed successfully.#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.892 187212 DEBUG nova.objects.instance [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.912 187212 DEBUG nova.virt.libvirt.vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T11:59:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1562123791',display_name='tempest-ServersAdminTestJSON-server-1562123791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1562123791',id=13,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:00:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-vj86fqlt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:00:11Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.913 187212 DEBUG nova.network.os_vif_util [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "address": "fa:16:3e:d0:fa:14", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf194d74d-a9", "ovs_interfaceid": "f194d74d-a9ec-4838-b35d-8393a2087ec5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.914 187212 DEBUG nova.network.os_vif_util [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.914 187212 DEBUG os_vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.916 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf194d74d-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.921 187212 INFO os_vif [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:fa:14,bridge_name='br-int',has_traffic_filtering=True,id=f194d74d-a9ec-4838-b35d-8393a2087ec5,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf194d74d-a9')#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.922 187212 INFO nova.virt.libvirt.driver [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deleting instance files /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa_del#033[00m
Dec  5 07:02:14 np0005546909 nova_compute[187208]: 2025-12-05 12:02:14.923 187212 INFO nova.virt.libvirt.driver [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deletion of /var/lib/nova/instances/3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa_del complete#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.007 187212 INFO nova.compute.manager [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.007 187212 DEBUG oslo.service.loopingcall [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.008 187212 DEBUG nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.008 187212 DEBUG nova.network.neutron [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:15 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.019 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.082 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.083 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.137 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855/disk --force-share --output=json -f qcow2" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.153 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.225 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.227 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:15 np0005546909 podman[218089]: 2025-12-05 12:02:15.234571503 +0000 UTC m=+0.079499232 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.267 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.269 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:02:15 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.321 187212 DEBUG nova.virt.libvirt.guest [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.324 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.365 187212 DEBUG nova.privsep.utils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.365 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.521 187212 DEBUG oslo_concurrency.processutils [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113.delta /var/lib/nova/instances/snapshots/tmp5fozw60p/be341bb402074a5e93ccb5918f96c113" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.523 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.752 187212 DEBUG nova.network.neutron [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.772 187212 INFO nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Took 0.76 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.815 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.816 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.939 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updated VIF entry in instance network info cache for port 78310fa8-21e8-49e5-8b60-867d1089ad71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.940 187212 DEBUG nova.network.neutron [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [{"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.965 187212 DEBUG oslo_concurrency.lockutils [req-3bee9f74-54d6-4708-a883-1388d4cd6459 req-297667f5-d726-49d9-b798-0eb9a1600d89 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-adc15883-b705-42dd-ac95-04f4b8964012" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.969 187212 DEBUG nova.compute.provider_tree [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:15 np0005546909 nova_compute[187208]: 2025-12-05 12:02:15.991 187212 DEBUG nova.scheduler.client.report [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.019 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.053 187212 INFO nova.scheduler.client.report [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.181 187212 DEBUG oslo_concurrency.lockutils [None req-511f2324-62ef-4a47-aa84-0c0afbf456ac 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.420 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936121.4190943, 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.421 187212 INFO nova.compute.manager [-] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.894 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.895 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing instance network info cache due to event network-changed-c72089e0-4937-40b6-86b5-f9d6d0982058. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.895 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.896 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.896 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Refreshing network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:16 np0005546909 nova_compute[187208]: 2025-12-05 12:02:16.901 187212 DEBUG nova.compute.manager [None req-a64b7af6-b7a9-43e6-a455-408fe8ec746e - - - - - -] [instance: 7b8cf31f-430b-4c7f-9c33-7d0cadd44d31] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.880 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.881 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.883 187212 INFO nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Terminating instance#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.884 187212 DEBUG nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:17 np0005546909 kernel: tap380c99a7-94 (unregistering): left promiscuous mode
Dec  5 07:02:17 np0005546909 NetworkManager[55691]: <info>  [1764936137.9189] device (tap380c99a7-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00183|binding|INFO|Releasing lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d from this chassis (sb_readonly=0)
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00184|binding|INFO|Setting lport 380c99a7-9480-45f8-b2f4-adfcdfa8576d down in Southbound
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00185|binding|INFO|Removing iface tap380c99a7-94 ovn-installed in OVS
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.929 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:4f:38 10.100.0.13'], port_security=['fa:16:3e:24:4f:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '982a8e69-5181-4847-bdfe-8d4de12bb2e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98815fe6b9ea4988abc2cccd9726dc86', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c37a488d-bf45-4dbe-bc9b-282a5e2aeaa2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c85a3007-c1d8-410f-afa2-138dae32aa49, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=380c99a7-9480-45f8-b2f4-adfcdfa8576d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 380c99a7-9480-45f8-b2f4-adfcdfa8576d in datapath 24c61e5e-7d15-4019-b1bd-d2e253f41aa5 unbound from our chassis#033[00m
Dec  5 07:02:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.933 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24c61e5e-7d15-4019-b1bd-d2e253f41aa5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00186|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00187|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00188|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:17Z|00189|binding|INFO|Releasing lport 1f09e8e7-18eb-4523-a8bb-10fee2270a91 from this chassis (sb_readonly=0)
Dec  5 07:02:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.935 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[539d8bd8-564c-495e-a041-a9eae5874dd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:17.936 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 namespace which is not needed anymore#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.946 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 WARNING nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-unplugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.947 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG oslo_concurrency.lockutils [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 DEBUG nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] No waiting events found dispatching network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.948 187212 WARNING nova.compute.manager [req-80c666c3-1574-4208-9ecd-5f3cc0981af6 req-1d599008-67ad-4881-9416-67f2f816d44a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received unexpected event network-vif-plugged-f194d74d-a9ec-4838-b35d-8393a2087ec5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:17 np0005546909 nova_compute[187208]: 2025-12-05 12:02:17.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:17 np0005546909 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  5 07:02:17 np0005546909 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000000c.scope: Consumed 12.993s CPU time.
Dec  5 07:02:17 np0005546909 systemd-machined[153543]: Machine qemu-26-instance-0000000c terminated.
Dec  5 07:02:18 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:18 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [NOTICE]   (214873) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:18 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [WARNING]  (214873) : Exiting Master process...
Dec  5 07:02:18 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [ALERT]    (214873) : Current worker (214875) exited with code 143 (Terminated)
Dec  5 07:02:18 np0005546909 neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5[214869]: [WARNING]  (214873) : All workers exited. Exiting... (0)
Dec  5 07:02:18 np0005546909 systemd[1]: libpod-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope: Deactivated successfully.
Dec  5 07:02:18 np0005546909 podman[218154]: 2025-12-05 12:02:18.080876252 +0000 UTC m=+0.050341929 container died 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:02:18 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:18 np0005546909 systemd[1]: var-lib-containers-storage-overlay-38bc886dbbbea2769a63b04a9e8180064790337d80822dc7c2e5b30fc62aed96-merged.mount: Deactivated successfully.
Dec  5 07:02:18 np0005546909 podman[218154]: 2025-12-05 12:02:18.12525911 +0000 UTC m=+0.094724787 container cleanup 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:02:18 np0005546909 systemd[1]: libpod-conmon-37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0.scope: Deactivated successfully.
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.153 187212 INFO nova.virt.libvirt.driver [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Instance destroyed successfully.#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.153 187212 DEBUG nova.objects.instance [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lazy-loading 'resources' on Instance uuid 982a8e69-5181-4847-bdfe-8d4de12bb2e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.171 187212 DEBUG nova.virt.libvirt.vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T11:59:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1785289561',display_name='tempest-ServersAdminTestJSON-server-1785289561',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1785289561',id=12,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:01:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98815fe6b9ea4988abc2cccd9726dc86',ramdisk_id='',reservation_id='r-1km5j15v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='2',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-715947304',owner_user_name='tempest-ServersAdminTestJSON-715947304-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:02Z,user_data=None,user_id='1ac3c267120a4aeaa91f472943c4e1e2',uuid=982a8e69-5181-4847-bdfe-8d4de12bb2e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.171 187212 DEBUG nova.network.os_vif_util [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converting VIF {"id": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "address": "fa:16:3e:24:4f:38", "network": {"id": "24c61e5e-7d15-4019-b1bd-d2e253f41aa5", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-157392420-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98815fe6b9ea4988abc2cccd9726dc86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap380c99a7-94", "ovs_interfaceid": "380c99a7-9480-45f8-b2f4-adfcdfa8576d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.173 187212 DEBUG nova.network.os_vif_util [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.173 187212 DEBUG os_vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.178 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap380c99a7-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:18 np0005546909 podman[218195]: 2025-12-05 12:02:18.187099996 +0000 UTC m=+0.039184010 container remove 37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.238 187212 INFO nova.virt.libvirt.driver [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Snapshot image upload complete#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.240 187212 INFO nova.compute.manager [None req-6d9ca9c9-ae27-4fde-9a41-9452cbbe33e0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 3.88 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.244 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4718333e-7696-4cba-899c-714e671a7815]: (4, ('Fri Dec  5 12:02:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 (37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0)\n37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0\nFri Dec  5 12:02:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 (37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0)\n37bc3e9b31c96212769e3f7d2200429a42066b45bd2895358236538e1affa1b0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef1e967-85d3-4feb-a83b-6dbc3db2f8fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.246 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24c61e5e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:18 np0005546909 kernel: tap24c61e5e-70: left promiscuous mode
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.253 187212 INFO os_vif [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4f:38,bridge_name='br-int',has_traffic_filtering=True,id=380c99a7-9480-45f8-b2f4-adfcdfa8576d,network=Network(24c61e5e-7d15-4019-b1bd-d2e253f41aa5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap380c99a7-94')#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.254 187212 INFO nova.virt.libvirt.driver [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deleting instance files /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.255 187212 INFO nova.virt.libvirt.driver [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deletion of /var/lib/nova/instances/982a8e69-5181-4847-bdfe-8d4de12bb2e4_del complete#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.262 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.266 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[533e6808-ed9f-4809-87de-eb53b1019145]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.278 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2186617-f7c0-44dd-b931-c119881e167f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.279 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[339a1f8a-5dac-4179-8633-13e121e8db73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.295 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8a1e7d-0e4d-4c85-a91f-19463211aa3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 338509, 'reachable_time': 39930, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218211, 'error': None, 'target': 'ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 systemd[1]: run-netns-ovnmeta\x2d24c61e5e\x2d7d15\x2d4019\x2db1bd\x2dd2e253f41aa5.mount: Deactivated successfully.
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.297 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24c61e5e-7d15-4019-b1bd-d2e253f41aa5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:18.297 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[288960ba-7c22-4631-865a-ad0eac004871]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.313 187212 INFO nova.compute.manager [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG oslo.service.loopingcall [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.314 187212 DEBUG nova.network.neutron [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.993 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936123.9929156, 1282e776-5758-493b-8f52-59839ebcd31b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:18 np0005546909 nova_compute[187208]: 2025-12-05 12:02:18.994 187212 INFO nova.compute.manager [-] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.013 187212 DEBUG nova.compute.manager [None req-cc4b3881-0770-4a6b-9ac2-777151284987 - - - - - -] [instance: 1282e776-5758-493b-8f52-59839ebcd31b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.022 187212 DEBUG nova.network.neutron [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.044 187212 INFO nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Took 0.73 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.065 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updated VIF entry in instance network info cache for port c72089e0-4937-40b6-86b5-f9d6d0982058. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.065 187212 DEBUG nova.network.neutron [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [{"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.102 187212 DEBUG oslo_concurrency.lockutils [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-1606eea3-5389-4437-b0f9-cfe6084d7871" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.102 187212 DEBUG nova.compute.manager [req-c0496f4e-ba1c-49c3-90de-a2a7d6b215b2 req-e0cd1e58-019d-402f-83ab-31365c51b774 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Received event network-vif-deleted-f194d74d-a9ec-4838-b35d-8393a2087ec5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.107 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.107 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.233 187212 DEBUG nova.compute.provider_tree [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.261 187212 DEBUG nova.scheduler.client.report [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.292 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.311 187212 INFO nova.scheduler.client.report [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Deleted allocations for instance 982a8e69-5181-4847-bdfe-8d4de12bb2e4#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.375 187212 DEBUG oslo_concurrency.lockutils [None req-55219202-2bba-42ff-8233-ef264795dd3e 1ac3c267120a4aeaa91f472943c4e1e2 98815fe6b9ea4988abc2cccd9726dc86 - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:19 np0005546909 nova_compute[187208]: 2025-12-05 12:02:19.469 187212 DEBUG nova.compute.manager [req-12e67ac1-37e7-4c50-9378-6be1ee976f58 req-11084258-784b-4a9c-9e1c-48e723ce2b14 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-deleted-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:20 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:20Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:42:5d 10.100.0.11
Dec  5 07:02:20 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:20Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:42:5d 10.100.0.11
Dec  5 07:02:20 np0005546909 nova_compute[187208]: 2025-12-05 12:02:20.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:21 np0005546909 podman[218248]: 2025-12-05 12:02:21.217817872 +0000 UTC m=+0.061981581 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container)
Dec  5 07:02:21 np0005546909 podman[218249]: 2025-12-05 12:02:21.235661382 +0000 UTC m=+0.074495899 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.261 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.262 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.262 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.263 187212 WARNING nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-unplugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.264 187212 DEBUG oslo_concurrency.lockutils [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "982a8e69-5181-4847-bdfe-8d4de12bb2e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.265 187212 DEBUG nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] No waiting events found dispatching network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.265 187212 WARNING nova.compute.manager [req-971ed5e8-de07-4c79-b7e5-76c0b21eae7c req-bb764cab-b81a-4b4e-8c27-1d8b4f297895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Received unexpected event network-vif-plugged-380c99a7-9480-45f8-b2f4-adfcdfa8576d for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:21Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ea:73:d9 10.100.0.11
Dec  5 07:02:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:21Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ea:73:d9 10.100.0.11
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.662 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.663 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.665 187212 INFO nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Terminating instance#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.666 187212 DEBUG nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:21 np0005546909 kernel: tapf6fc1ec5-ea (unregistering): left promiscuous mode
Dec  5 07:02:21 np0005546909 NetworkManager[55691]: <info>  [1764936141.6846] device (tapf6fc1ec5-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:21Z|00190|binding|INFO|Releasing lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa from this chassis (sb_readonly=0)
Dec  5 07:02:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:21Z|00191|binding|INFO|Setting lport f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa down in Southbound
Dec  5 07:02:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:21Z|00192|binding|INFO|Removing iface tapf6fc1ec5-ea ovn-installed in OVS
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.702 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:9d:39 10.100.0.12'], port_security=['fa:16:3e:5f:9d:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '795a269a-5af9-4e6a-bf1f-e2bb83634855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.703 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis#033[00m
Dec  5 07:02:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.705 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.706 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff319d56-7013-4f30-b7e9-6014c597bc35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:21.707 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:21 np0005546909 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec  5 07:02:21 np0005546909 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000001b.scope: Consumed 3.363s CPU time.
Dec  5 07:02:21 np0005546909 systemd-machined[153543]: Machine qemu-31-instance-0000001b terminated.
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.926 187212 INFO nova.virt.libvirt.driver [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Instance destroyed successfully.#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.927 187212 DEBUG nova.objects.instance [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid 795a269a-5af9-4e6a-bf1f-e2bb83634855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.959 187212 DEBUG nova.virt.libvirt.vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1602902542',display_name='tempest-ImagesTestJSON-server-1602902542',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1602902542',id=27,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-5xs2qi13',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_mi
n_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:18Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=795a269a-5af9-4e6a-bf1f-e2bb83634855,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.960 187212 DEBUG nova.network.os_vif_util [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "address": "fa:16:3e:5f:9d:39", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc1ec5-ea", "ovs_interfaceid": "f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.961 187212 DEBUG nova.network.os_vif_util [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.961 187212 DEBUG os_vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:21 np0005546909 nova_compute[187208]: 2025-12-05 12:02:21.964 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6fc1ec5-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.008 187212 INFO os_vif [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:9d:39,bridge_name='br-int',has_traffic_filtering=True,id=f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc1ec5-ea')#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.009 187212 INFO nova.virt.libvirt.driver [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deleting instance files /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855_del#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.010 187212 INFO nova.virt.libvirt.driver [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deletion of /var/lib/nova/instances/795a269a-5af9-4e6a-bf1f-e2bb83634855_del complete#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.014 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936127.013532, d95c0324-d1d3-4960-9ab7-3a2a098a9f7c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.014 187212 INFO nova.compute.manager [-] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.056 187212 DEBUG nova.compute.manager [None req-bf31b66c-d366-4ea7-9bdb-e76be5da22f0 - - - - - -] [instance: d95c0324-d1d3-4960-9ab7-3a2a098a9f7c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.068 187212 INFO nova.compute.manager [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.068 187212 DEBUG oslo.service.loopingcall [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.069 187212 DEBUG nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.069 187212 DEBUG nova.network.neutron [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [NOTICE]   (217946) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : Exiting Master process...
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : Exiting Master process...
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [ALERT]    (217946) : Current worker (217948) exited with code 143 (Terminated)
Dec  5 07:02:22 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[217942]: [WARNING]  (217946) : All workers exited. Exiting... (0)
Dec  5 07:02:22 np0005546909 systemd[1]: libpod-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope: Deactivated successfully.
Dec  5 07:02:22 np0005546909 podman[218312]: 2025-12-05 12:02:22.181255342 +0000 UTC m=+0.378509333 container died 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  5 07:02:22 np0005546909 systemd[1]: var-lib-containers-storage-overlay-55714bde8320c7ed878ac9913f0bc25c165a8c6293c219aeaa49515054bdec12-merged.mount: Deactivated successfully.
Dec  5 07:02:22 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:22 np0005546909 podman[218312]: 2025-12-05 12:02:22.227109341 +0000 UTC m=+0.424363322 container cleanup 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  5 07:02:22 np0005546909 systemd[1]: libpod-conmon-946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3.scope: Deactivated successfully.
Dec  5 07:02:22 np0005546909 podman[218356]: 2025-12-05 12:02:22.296767581 +0000 UTC m=+0.048033853 container remove 946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.301 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9973b4f4-26ee-413f-b875-4d6e4b61fc69]: (4, ('Fri Dec  5 12:02:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3)\n946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3\nFri Dec  5 12:02:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3)\n946912ad829d4d46a7fb313c1dfee6bfb456e5b12833ddd431a0681ef6f3d0f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.307 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cb84d507-e922-4d99-b274-02e3f5739354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.309 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:22 np0005546909 kernel: tap41b3b495-c0: left promiscuous mode
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.317 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[85de475a-f15e-4455-ad81-6eaaed22bff3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3de163-e614-4fc6-a83f-88170d2676e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e91589d1-2f82-4123-8133-f6ec6a6bfd80]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.357 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c73c474-df8a-48e1-9e14-2ecdb1ede8d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350633, 'reachable_time': 28597, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218372, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.359 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:22.359 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[092bff9e-c70c-4988-8719-ed41f00a5c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.671 187212 DEBUG nova.network.neutron [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.694 187212 INFO nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Took 0.62 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.747 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.748 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.838 187212 DEBUG nova.compute.provider_tree [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.853 187212 DEBUG nova.scheduler.client.report [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.877 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:22 np0005546909 nova_compute[187208]: 2025-12-05 12:02:22.930 187212 INFO nova.scheduler.client.report [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance 795a269a-5af9-4e6a-bf1f-e2bb83634855#033[00m
Dec  5 07:02:23 np0005546909 nova_compute[187208]: 2025-12-05 12:02:23.008 187212 DEBUG oslo_concurrency.lockutils [None req-a8530f86-a9ac-4cf6-a8ba-25782c769a16 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:23 np0005546909 nova_compute[187208]: 2025-12-05 12:02:23.014 187212 DEBUG nova.compute.manager [req-9a764540-51d3-42c2-b7b6-cb8c0f7495a1 req-2541d075-80b0-47a1-99dd-b4b47ad87f0b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-deleted-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.560 187212 DEBUG nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.561 187212 DEBUG oslo_concurrency.lockutils [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.562 187212 DEBUG nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.562 187212 WARNING nova.compute.manager [req-c08a92f7-9e1f-4e29-9425-e05608e41437 req-5d528765-ac41-4329-95d1-e1c93145f325 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-unplugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state deleted and task_state None.
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.782 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936129.7816968, 4e7aec76-673e-48b5-b183-cc9c7a95fd37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.783 187212 INFO nova.compute.manager [-] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] VM Stopped (Lifecycle Event)
Dec  5 07:02:24 np0005546909 nova_compute[187208]: 2025-12-05 12:02:24.807 187212 DEBUG nova.compute.manager [None req-4e58248a-0e7a-46f1-94b7-9a7ddbdf5bb7 - - - - - -] [instance: 4e7aec76-673e-48b5-b183-cc9c7a95fd37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:25Z|00193|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:02:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:25Z|00194|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.676 187212 DEBUG nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.676 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG oslo_concurrency.lockutils [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "795a269a-5af9-4e6a-bf1f-e2bb83634855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 DEBUG nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] No waiting events found dispatching network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.677 187212 WARNING nova.compute.manager [req-5b8f3176-e3c0-4671-9a8d-6e217d11a845 req-f5786776-6c71-42d1-819f-c6bcaa13ca71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Received unexpected event network-vif-plugged-f6fc1ec5-ea3c-43fe-bd7d-2d5f532120aa for instance with vm_state deleted and task_state None.
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.778 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.779 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.798 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.867 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.867 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.873 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:02:26 np0005546909 nova_compute[187208]: 2025-12-05 12:02:26.873 187212 INFO nova.compute.claims [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.029 187212 DEBUG nova.compute.provider_tree [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.045 187212 DEBUG nova.scheduler.client.report [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.071 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.072 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:02:27 np0005546909 podman[218375]: 2025-12-05 12:02:27.205378296 +0000 UTC m=+0.056969488 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.229 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.229 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:02:27 np0005546909 podman[218376]: 2025-12-05 12:02:27.234141168 +0000 UTC m=+0.084636739 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.244 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.260 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.346 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.347 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.347 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating image(s)
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.348 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.348 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.349 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.360 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.416 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.417 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.417 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.428 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.481 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.482 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.516 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.517 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.518 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.575 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.576 187212 DEBUG nova.virt.disk.api [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.576 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.636 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.637 187212 DEBUG nova.virt.disk.api [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.637 187212 DEBUG nova.objects.instance [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.653 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.654 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Ensure instance console log exists: /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.654 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.655 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.655 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:27 np0005546909 nova_compute[187208]: 2025-12-05 12:02:27.782 187212 DEBUG nova.policy [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:02:29 np0005546909 nova_compute[187208]: 2025-12-05 12:02:29.604 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Successfully created port: 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:02:29 np0005546909 nova_compute[187208]: 2025-12-05 12:02:29.891 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936134.8898897, 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:02:29 np0005546909 nova_compute[187208]: 2025-12-05 12:02:29.891 187212 INFO nova.compute.manager [-] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] VM Stopped (Lifecycle Event)
Dec  5 07:02:29 np0005546909 nova_compute[187208]: 2025-12-05 12:02:29.913 187212 DEBUG nova.compute.manager [None req-b9aadafc-d44f-46b1-a0a7-e63dece5f2aa - - - - - -] [instance: 3b55ef3a-7a32-4e13-9ad3-2a53ae73a6aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:02:30 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:30Z|00195|binding|INFO|Releasing lport 8e60b4fb-312d-4ef3-8d65-1f9d4ef1d4ef from this chassis (sb_readonly=0)
Dec  5 07:02:30 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:30Z|00196|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.100 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.101 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.102 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.103 187212 INFO nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Terminating instance
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.104 187212 DEBUG nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:02:30 np0005546909 kernel: tap78310fa8-21 (unregistering): left promiscuous mode
Dec  5 07:02:30 np0005546909 NetworkManager[55691]: <info>  [1764936150.1340] device (tap78310fa8-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:30Z|00197|binding|INFO|Releasing lport 78310fa8-21e8-49e5-8b60-867d1089ad71 from this chassis (sb_readonly=0)
Dec  5 07:02:30 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:30Z|00198|binding|INFO|Setting lport 78310fa8-21e8-49e5-8b60-867d1089ad71 down in Southbound
Dec  5 07:02:30 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:30Z|00199|binding|INFO|Removing iface tap78310fa8-21 ovn-installed in OVS
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.153 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:42:5d 10.100.0.11'], port_security=['fa:16:3e:c8:42:5d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'adc15883-b705-42dd-ac95-04f4b8964012', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '4', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=78310fa8-21e8-49e5-8b60-867d1089ad71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.155 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 78310fa8-21e8-49e5-8b60-867d1089ad71 in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 unbound from our chassis#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.157 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c8f5b3a-f6cb-4a79-b07c-b020e01db895]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.158 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace which is not needed anymore#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec  5 07:02:30 np0005546909 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000019.scope: Consumed 13.526s CPU time.
Dec  5 07:02:30 np0005546909 systemd-machined[153543]: Machine qemu-29-instance-00000019 terminated.
Dec  5 07:02:30 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:30 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [NOTICE]   (217354) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:30 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [WARNING]  (217354) : Exiting Master process...
Dec  5 07:02:30 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [ALERT]    (217354) : Current worker (217357) exited with code 143 (Terminated)
Dec  5 07:02:30 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[217340]: [WARNING]  (217354) : All workers exited. Exiting... (0)
Dec  5 07:02:30 np0005546909 systemd[1]: libpod-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope: Deactivated successfully.
Dec  5 07:02:30 np0005546909 podman[218464]: 2025-12-05 12:02:30.298935978 +0000 UTC m=+0.044294746 container died f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:30 np0005546909 systemd[1]: var-lib-containers-storage-overlay-26091b88884c53d07a76d03c6c9c66adb5d232a7c306c3b78dafe02bf1e95c96-merged.mount: Deactivated successfully.
Dec  5 07:02:30 np0005546909 podman[218464]: 2025-12-05 12:02:30.345164689 +0000 UTC m=+0.090523427 container cleanup f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  5 07:02:30 np0005546909 systemd[1]: libpod-conmon-f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751.scope: Deactivated successfully.
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.381 187212 INFO nova.virt.libvirt.driver [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Instance destroyed successfully.#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.382 187212 DEBUG nova.objects.instance [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'resources' on Instance uuid adc15883-b705-42dd-ac95-04f4b8964012 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.395 187212 DEBUG nova.virt.libvirt.vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-555517467',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(26),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-555517467',id=25,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=26,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-hjkfnf9c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=adc15883-b705-42dd-ac95-04f4b8964012,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.395 187212 DEBUG nova.network.os_vif_util [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "78310fa8-21e8-49e5-8b60-867d1089ad71", "address": "fa:16:3e:c8:42:5d", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap78310fa8-21", "ovs_interfaceid": "78310fa8-21e8-49e5-8b60-867d1089ad71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.396 187212 DEBUG nova.network.os_vif_util [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.397 187212 DEBUG os_vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.398 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.398 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap78310fa8-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.403 187212 INFO os_vif [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:42:5d,bridge_name='br-int',has_traffic_filtering=True,id=78310fa8-21e8-49e5-8b60-867d1089ad71,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap78310fa8-21')#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.404 187212 INFO nova.virt.libvirt.driver [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deleting instance files /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012_del#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.405 187212 INFO nova.virt.libvirt.driver [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deletion of /var/lib/nova/instances/adc15883-b705-42dd-ac95-04f4b8964012_del complete#033[00m
Dec  5 07:02:30 np0005546909 podman[218515]: 2025-12-05 12:02:30.419873593 +0000 UTC m=+0.045918343 container remove f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:02:30 np0005546909 podman[218479]: 2025-12-05 12:02:30.420631934 +0000 UTC m=+0.102689704 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.425 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a1bcd4-2b18-45ce-a2a1-efa64eb07322]: (4, ('Fri Dec  5 12:02:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751)\nf764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751\nFri Dec  5 12:02:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (f764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751)\nf764b3445d4ab4f395976856d8054fede9716e02190b1696b8a3ea1e47119751\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.427 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24c2d4f4-a800-420a-9690-b7b8983ca63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.428 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.430 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 kernel: tap393d33f9-20: left promiscuous mode
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[afd47196-fd8a-4a32-bd5a-de296488c5fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.448 187212 INFO nova.compute.manager [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG oslo.service.loopingcall [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:30 np0005546909 nova_compute[187208]: 2025-12-05 12:02:30.449 187212 DEBUG nova.network.neutron [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.452 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[67a1e61b-34cf-4118-841b-0b964c02a615]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cea19e8-ff15-4ac0-b9bc-0fe5d52096de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.469 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad284364-93ee-46d6-ad08-906fb866e5d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 348665, 'reachable_time': 44019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218546, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.471 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:30 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:30.471 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[11a414d8-54c2-4469-bd44-b92e198814cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:30 np0005546909 systemd[1]: run-netns-ovnmeta\x2d393d33f9\x2d2dde\x2d4fb5\x2db5db\x2d3f0fb98d4637.mount: Deactivated successfully.
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.075 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG oslo_concurrency.lockutils [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:31 np0005546909 nova_compute[187208]: 2025-12-05 12:02:31.076 187212 DEBUG nova.compute.manager [req-99cd9a85-4775-477f-a5e0-26366513ec29 req-2684cfce-b551-4ae7-a407-e9f5a6c2e835 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-unplugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.566 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.871 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Successfully updated port: 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.885 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.886 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:32 np0005546909 nova_compute[187208]: 2025-12-05 12:02:32.886 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.153 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936138.15153, 982a8e69-5181-4847-bdfe-8d4de12bb2e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.153 187212 INFO nova.compute.manager [-] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.176 187212 DEBUG nova.compute.manager [None req-4b354250-7e89-455a-b0ca-09794c64b7c5 - - - - - -] [instance: 982a8e69-5181-4847-bdfe-8d4de12bb2e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.419 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "adc15883-b705-42dd-ac95-04f4b8964012-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] No waiting events found dispatching network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.420 187212 WARNING nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received unexpected event network-vif-plugged-78310fa8-21e8-49e5-8b60-867d1089ad71 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-changed-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG nova.compute.manager [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Refreshing instance network info cache due to event network-changed-82089bf4-207e-4880-b8ff-9bf09a4ac3fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.421 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:33 np0005546909 nova_compute[187208]: 2025-12-05 12:02:33.446 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.059 187212 DEBUG nova.network.neutron [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.086 187212 INFO nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Took 3.64 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.130 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.130 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.233 187212 DEBUG nova.compute.provider_tree [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.262 187212 DEBUG nova.scheduler.client.report [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.284 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.350 187212 INFO nova.scheduler.client.report [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Deleted allocations for instance adc15883-b705-42dd-ac95-04f4b8964012#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.426 187212 DEBUG oslo_concurrency.lockutils [None req-36668c74-f36b-4ffc-8c7a-fd7a9b981260 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "adc15883-b705-42dd-ac95-04f4b8964012" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.873 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.874 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.875 187212 INFO nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Terminating instance#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.876 187212 DEBUG nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:34 np0005546909 kernel: tapc72089e0-49 (unregistering): left promiscuous mode
Dec  5 07:02:34 np0005546909 NetworkManager[55691]: <info>  [1764936154.9026] device (tapc72089e0-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:34Z|00200|binding|INFO|Releasing lport c72089e0-4937-40b6-86b5-f9d6d0982058 from this chassis (sb_readonly=0)
Dec  5 07:02:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:34Z|00201|binding|INFO|Setting lport c72089e0-4937-40b6-86b5-f9d6d0982058 down in Southbound
Dec  5 07:02:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:34Z|00202|binding|INFO|Removing iface tapc72089e0-49 ovn-installed in OVS
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.923 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:73:d9 10.100.0.11'], port_security=['fa:16:3e:ea:73:d9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1606eea3-5389-4437-b0f9-cfe6084d7871', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e3cd52d70d1a4be8ae891298ff7e1018', 'neutron:revision_number': '4', 'neutron:security_group_ids': '753f16cd-17e0-4f5a-8936-b01e8b5b8119', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1ba8a60-bda5-4c97-91b2-1ae7ea8aa092, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c72089e0-4937-40b6-86b5-f9d6d0982058) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.925 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c72089e0-4937-40b6-86b5-f9d6d0982058 in datapath 904b3233-fdc6-4df0-b02a-f30a1e47627b unbound from our chassis#033[00m
Dec  5 07:02:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.927 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 904b3233-fdc6-4df0-b02a-f30a1e47627b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.928 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fe1fab-7e91-455e-9385-a6746539a05a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:34.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b namespace which is not needed anymore#033[00m
Dec  5 07:02:34 np0005546909 nova_compute[187208]: 2025-12-05 12:02:34.934 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:34 np0005546909 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec  5 07:02:34 np0005546909 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000001a.scope: Consumed 12.719s CPU time.
Dec  5 07:02:34 np0005546909 systemd-machined[153543]: Machine qemu-30-instance-0000001a terminated.
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [NOTICE]   (217539) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : Exiting Master process...
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : Exiting Master process...
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [ALERT]    (217539) : Current worker (217541) exited with code 143 (Terminated)
Dec  5 07:02:35 np0005546909 neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b[217535]: [WARNING]  (217539) : All workers exited. Exiting... (0)
Dec  5 07:02:35 np0005546909 systemd[1]: libpod-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope: Deactivated successfully.
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:35 np0005546909 podman[218573]: 2025-12-05 12:02:35.062075941 +0000 UTC m=+0.044523213 container died 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 07:02:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay-9f782e29042a4d6587770f14d2c44c7361a37482c8593077cf3a706cbf5d68ff-merged.mount: Deactivated successfully.
Dec  5 07:02:35 np0005546909 NetworkManager[55691]: <info>  [1764936155.0974] manager: (tapc72089e0-49): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Dec  5 07:02:35 np0005546909 podman[218573]: 2025-12-05 12:02:35.108337992 +0000 UTC m=+0.090785254 container cleanup 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:02:35 np0005546909 systemd[1]: libpod-conmon-98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1.scope: Deactivated successfully.
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.131 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.132 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.149 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.155 187212 INFO nova.virt.libvirt.driver [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Instance destroyed successfully.#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.156 187212 DEBUG nova.objects.instance [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lazy-loading 'resources' on Instance uuid 1606eea3-5389-4437-b0f9-cfe6084d7871 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.176 187212 DEBUG nova.virt.libvirt.vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:01:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-306695219',display_name='tempest-ServersTestManualDisk-server-306695219',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-306695219',id=26,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBUGvAJW8rYe/hjaW/hFZe4neO1wzdrge/WiC/SnDk7t8/AXKetmZ8zo2NHECOEnhI/cR+zSyaxyLqYdEo4m6l7dGZQlwDucN9SIoLiq2LpSC0tXmPTDFsuOTXYjC2rzw==',key_name='tempest-keypair-2064130855',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e3cd52d70d1a4be8ae891298ff7e1018',ramdisk_id='',reservation_id='r-w3qpedx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1916815153',owner_user_name='tempest-ServersTestManualDisk-1916815153-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ff53b25ec85543eeb2bdea04a6eeaac4',uuid=1606eea3-5389-4437-b0f9-cfe6084d7871,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.176 187212 DEBUG nova.network.os_vif_util [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converting VIF {"id": "c72089e0-4937-40b6-86b5-f9d6d0982058", "address": "fa:16:3e:ea:73:d9", "network": {"id": "904b3233-fdc6-4df0-b02a-f30a1e47627b", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-968186511-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e3cd52d70d1a4be8ae891298ff7e1018", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc72089e0-49", "ovs_interfaceid": "c72089e0-4937-40b6-86b5-f9d6d0982058", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.177 187212 DEBUG nova.network.os_vif_util [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.177 187212 DEBUG os_vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.179 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc72089e0-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 podman[218615]: 2025-12-05 12:02:35.18316417 +0000 UTC m=+0.044509043 container remove 98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.186 187212 INFO os_vif [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:73:d9,bridge_name='br-int',has_traffic_filtering=True,id=c72089e0-4937-40b6-86b5-f9d6d0982058,network=Network(904b3233-fdc6-4df0-b02a-f30a1e47627b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc72089e0-49')#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.187 187212 INFO nova.virt.libvirt.driver [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deleting instance files /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871_del#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.188 187212 INFO nova.virt.libvirt.driver [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deletion of /var/lib/nova/instances/1606eea3-5389-4437-b0f9-cfe6084d7871_del complete#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.188 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3dc55b8-381e-49ea-8236-a2cc6661a01f]: (4, ('Fri Dec  5 12:02:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b (98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1)\n98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1\nFri Dec  5 12:02:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b (98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1)\n98841e0871aedaaea52950d16e28c5c2a1aa65db6f7c8d0c0ff7b6d86ab842c1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[467066ee-eb7c-4989-830d-61dff951a6bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.193 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap904b3233-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:35 np0005546909 kernel: tap904b3233-f0: left promiscuous mode
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.195 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c12c75f7-c5e9-4d2c-afed-6bbed9665ccf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.218 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.218 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.224 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.224 187212 INFO nova.compute.claims [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71ff184b-da5f-465b-86d4-e13d144c1ab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 INFO nova.compute.manager [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG oslo.service.loopingcall [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84468348-a198-4008-8d79-b69054dd61d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.235 187212 DEBUG nova.network.neutron [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.252 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bab32d45-1bb6-43eb-86d6-93778f0edc74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349256, 'reachable_time': 30173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218635, 'error': None, 'target': 'ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.255 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-904b3233-fdc6-4df0-b02a-f30a1e47627b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.255 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4dcfa2a7-ccc6-4b65-9ab1-b250489116b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:35 np0005546909 systemd[1]: run-netns-ovnmeta\x2d904b3233\x2dfdc6\x2d4df0\x2db02a\x2df30a1e47627b.mount: Deactivated successfully.
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.354 187212 DEBUG nova.compute.provider_tree [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.360 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:02:35.361 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.369 187212 DEBUG nova.scheduler.client.report [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.393 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.394 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.441 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.442 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.465 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.503 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.613 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.615 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.615 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating image(s)#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.616 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.616 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.617 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.630 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.662 187212 DEBUG nova.network.neutron [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.666 187212 DEBUG nova.compute.manager [req-5f9fdda8-f52c-4d14-99eb-1c9167fd6a7f req-b15a3d5b-418e-41ec-b05f-40016a081e39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Received event network-vif-deleted-78310fa8-21e8-49e5-8b60-867d1089ad71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.692 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance network_info: |[{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.693 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Refreshing network info cache for port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.697 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start _get_guest_xml network_info=[{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.700 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.701 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.702 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.718 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.755 187212 WARNING nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.766 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.766 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.770 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.770 187212 DEBUG nova.virt.libvirt.host [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.771 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.771 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.772 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.773 187212 DEBUG nova.virt.hardware [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.777 187212 DEBUG nova.virt.libvirt.vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=T
agList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:27Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.777 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.778 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.779 187212 DEBUG nova.objects.instance [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.789 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.789 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.810 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <uuid>9efa988a-19ae-440a-8a56-0bac68cb3c9e</uuid>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <name>instance-0000001c</name>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesTestJSON-server-1705515982</nova:name>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:35</nova:creationTime>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        <nova:port uuid="82089bf4-207e-4880-b8ff-9bf09a4ac3fb">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="serial">9efa988a-19ae-440a-8a56-0bac68cb3c9e</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="uuid">9efa988a-19ae-440a-8a56-0bac68cb3c9e</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:53:25:56"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <target dev="tap82089bf4-20"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/console.log" append="off"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:35 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:35 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:35 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:35 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.812 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Preparing to wait for external event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.813 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.814 187212 DEBUG nova.virt.libvirt.vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:27Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.815 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.815 187212 DEBUG nova.network.os_vif_util [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.816 187212 DEBUG os_vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.816 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.817 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.817 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.821 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82089bf4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.821 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82089bf4-20, col_values=(('external_ids', {'iface-id': '82089bf4-207e-4880-b8ff-9bf09a4ac3fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:25:56', 'vm-uuid': '9efa988a-19ae-440a-8a56-0bac68cb3c9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 NetworkManager[55691]: <info>  [1764936155.8244] manager: (tap82089bf4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.827 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.831 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.832 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.851 187212 INFO os_vif [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20')#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.889 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.890 187212 DEBUG nova.virt.disk.api [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Checking if we can resize image /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.891 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.891 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:35.893 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.936 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:53:25:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.937 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Using config drive#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.944 187212 DEBUG nova.policy [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79758a6c7516459bb1907270241d266a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '342e6d694cf6482c9f1b7557a17bce60', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.960 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.961 187212 DEBUG nova.virt.disk.api [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Cannot resize image /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.962 187212 DEBUG nova.objects.instance [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'migration_context' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.978 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.979 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.980 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.980 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.981 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:35 np0005546909 nova_compute[187208]: 2025-12-05 12:02:35.981 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.006 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.007 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.050 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.051 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.064 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.121 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.126 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.127 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.128 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.145 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.200 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.201 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.233 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.234 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.234 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.297 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.298 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.298 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Ensure instance console log exists: /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.299 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.461 187212 DEBUG nova.network.neutron [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.477 187212 INFO nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Took 1.24 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.591 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.592 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.683 187212 DEBUG nova.compute.provider_tree [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.698 187212 DEBUG nova.scheduler.client.report [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.720 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.757 187212 INFO nova.scheduler.client.report [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Deleted allocations for instance 1606eea3-5389-4437-b0f9-cfe6084d7871#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.837 187212 DEBUG nova.compute.manager [req-fac4c2fb-eeca-4675-85cf-78e81a664d28 req-94089a6b-0687-4204-abd7-2ac25bc5073b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-deleted-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.840 187212 DEBUG oslo_concurrency.lockutils [None req-db9bf56b-5349-4132-aed6-43cfa6ebceda ff53b25ec85543eeb2bdea04a6eeaac4 e3cd52d70d1a4be8ae891298ff7e1018 - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.967s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:36.897 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.902 187212 INFO nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Creating config drive at /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.906 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ersl1ph execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.925 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936141.9238977, 795a269a-5af9-4e6a-bf1f-e2bb83634855 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.926 187212 INFO nova.compute.manager [-] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:36 np0005546909 nova_compute[187208]: 2025-12-05 12:02:36.951 187212 DEBUG nova.compute.manager [None req-a41ee703-2c98-4540-9e13-a6d8a24dd1a5 - - - - - -] [instance: 795a269a-5af9-4e6a-bf1f-e2bb83634855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.034 187212 DEBUG oslo_concurrency.processutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ersl1ph" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:37 np0005546909 kernel: tap82089bf4-20: entered promiscuous mode
Dec  5 07:02:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:37Z|00203|binding|INFO|Claiming lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb for this chassis.
Dec  5 07:02:37 np0005546909 systemd-udevd[218553]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.1267] manager: (tap82089bf4-20): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Dec  5 07:02:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:37Z|00204|binding|INFO|82089bf4-207e-4880-b8ff-9bf09a4ac3fb: Claiming fa:16:3e:53:25:56 10.100.0.7
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.136 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.138 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.140 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:37Z|00205|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb ovn-installed in OVS
Dec  5 07:02:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:37Z|00206|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb up in Southbound
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.147 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.1542] device (tap82089bf4-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.1552] device (tap82089bf4-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.150 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[272335ef-db9f-4114-81e5-c34568bb286b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.151 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.157 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a305e5aa-00f4-49a4-9b12-6f1a8aafb1b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b5283076-864c-4047-bebf-4e9f57a6eeeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.170 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b13267c0-97f5-4d14-9a01-4063a7518ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 systemd-machined[153543]: New machine qemu-32-instance-0000001c.
Dec  5 07:02:37 np0005546909 systemd[1]: Started Virtual Machine qemu-32-instance-0000001c.
Dec  5 07:02:37 np0005546909 podman[218681]: 2025-12-05 12:02:37.196327132 +0000 UTC m=+0.079320067 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.199 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54957442-e824-44f5-83fc-4e8eb45ac0f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.237 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[49157da1-adc4-473a-8a41-116aa4b91abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.246 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca04bc0-7b2a-4059-b72c-1d86fb13393f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.2467] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.272 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3d8938c0-5699-4433-9479-16a6e7ac08c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.274 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc15d67-69f8-45de-bf83-ebefba5ea24b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.2943] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.299 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[81e0ae70-2e41-4435-a08a-5920da16755b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f557dd8-b923-4118-b6e4-3f8afd4386b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353785, 'reachable_time': 36100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218743, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[86a6c34e-30cb-4647-ab5e-1acb7bb5c3dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353785, 'tstamp': 353785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218744, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.344 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff6eddd-beb1-460b-a29f-d07310db2ae3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353785, 'reachable_time': 36100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218745, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[69615094-f217-47d8-bcd4-76a095b3ef98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5c9de8-ceab-4749-ac42-f60d6093b68c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:37 np0005546909 NetworkManager[55691]: <info>  [1764936157.4234] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.437 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.439 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:37Z|00207|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.451 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.453 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.454 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78194e8f-ce7b-49f1-9eb1-ce959e418412]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.455 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:37.456 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.526 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:37 np0005546909 podman[218779]: 2025-12-05 12:02:37.806638144 +0000 UTC m=+0.049942777 container create 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.828 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936157.8281279, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.829 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:02:37 np0005546909 systemd[1]: Started libpod-conmon-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope.
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.848 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.852 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936157.828551, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.852 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.867 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:37 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.870 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:37 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df6dbe72f381b99c1a982824974be4cad9c714499ad9f3e0b3b9579fdd54f357/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:37 np0005546909 podman[218779]: 2025-12-05 12:02:37.779488069 +0000 UTC m=+0.022792712 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:37 np0005546909 podman[218779]: 2025-12-05 12:02:37.887119843 +0000 UTC m=+0.130424476 container init 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:02:37 np0005546909 nova_compute[187208]: 2025-12-05 12:02:37.890 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:37 np0005546909 podman[218779]: 2025-12-05 12:02:37.892839436 +0000 UTC m=+0.136144069 container start 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:02:37 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : New worker (218803) forked
Dec  5 07:02:37 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : Loading success.
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.092 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.093 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.184 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.221 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.222 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.222 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.223 187212 WARNING nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-unplugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.224 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 DEBUG oslo_concurrency.lockutils [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "1606eea3-5389-4437-b0f9-cfe6084d7871-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 DEBUG nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] No waiting events found dispatching network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.225 187212 WARNING nova.compute.manager [req-92ba0e7b-754f-4318-a109-22f18c088755 req-47f5588c-c71c-436a-b8fa-f539ed0b75cb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Received unexpected event network-vif-plugged-c72089e0-4937-40b6-86b5-f9d6d0982058 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.253 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.254 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.308 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.318 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.318 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.332 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.404 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.405 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.410 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.410 187212 INFO nova.compute.claims [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.497 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.498 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=73.29545211791992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.498 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.590 187212 DEBUG nova.compute.provider_tree [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.644 187212 DEBUG nova.scheduler.client.report [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.678 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.679 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.681 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.732 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.733 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.746 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.761 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.763 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 9efa988a-19ae-440a-8a56-0bac68cb3c9e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.764 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.840 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.841 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.841 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating image(s)#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.843 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.856 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.882 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.900 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.924 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.925 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.926 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.927 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.928 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:38 np0005546909 nova_compute[187208]: 2025-12-05 12:02:38.944 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.026 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.027 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.069 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.070 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.070 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.131 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.133 187212 DEBUG nova.virt.disk.api [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Checking if we can resize image /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.134 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.188 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.190 187212 DEBUG nova.virt.disk.api [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Cannot resize image /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.190 187212 DEBUG nova.objects.instance [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'migration_context' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.206 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.207 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Ensure instance console log exists: /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.207 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.208 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.208 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.216 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updated VIF entry in instance network info cache for port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.217 187212 DEBUG nova.network.neutron [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [{"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.233 187212 DEBUG oslo_concurrency.lockutils [req-0cbaea1e-5575-4569-ba70-4b5afdf50a1f req-460f347c-5519-4853-9fb0-5255fcd9ef19 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-9efa988a-19ae-440a-8a56-0bac68cb3c9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.556 187212 DEBUG nova.policy [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e4a0640c63a14775b62a4d40c4860519', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.788 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Successfully created port: b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.926 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.927 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:39 np0005546909 nova_compute[187208]: 2025-12-05 12:02:39.927 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.333 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.334 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Processing event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.335 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG oslo_concurrency.lockutils [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 DEBUG nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.336 187212 WARNING nova.compute.manager [req-ad936839-736b-4a1b-8602-633269962ec9 req-a48aa57c-ff77-41f4-a19a-ec250e15aa7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.338 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.343 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.343 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936160.3430412, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.344 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.347 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance spawned successfully.#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.348 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.368 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.370 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.375 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.378 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.378 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.379 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.379 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.380 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.380 187212 DEBUG nova.virt.libvirt.driver [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.413 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.455 187212 INFO nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 13.11 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.455 187212 DEBUG nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.529 187212 INFO nova.compute.manager [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 13.68 seconds to build instance.#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.549 187212 DEBUG oslo_concurrency.lockutils [None req-af697095-53f9-47db-b1d7-34fc3f214282 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:40 np0005546909 nova_compute[187208]: 2025-12-05 12:02:40.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:41 np0005546909 nova_compute[187208]: 2025-12-05 12:02:41.601 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Successfully created port: d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:02:41 np0005546909 nova_compute[187208]: 2025-12-05 12:02:41.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.902 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Successfully updated port: b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.918 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.919 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:42 np0005546909 nova_compute[187208]: 2025-12-05 12:02:42.919 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:43 np0005546909 nova_compute[187208]: 2025-12-05 12:02:43.214 187212 DEBUG nova.compute.manager [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:43 np0005546909 nova_compute[187208]: 2025-12-05 12:02:43.214 187212 DEBUG nova.compute.manager [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing instance network info cache due to event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:43 np0005546909 nova_compute[187208]: 2025-12-05 12:02:43.215 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:43 np0005546909 nova_compute[187208]: 2025-12-05 12:02:43.258 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.181 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.182 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.182 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.186 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.186 187212 DEBUG nova.objects.instance [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'flavor' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.213 187212 DEBUG nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.532 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Successfully updated port: d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.552 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.553 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquired lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.553 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:44Z|00208|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:44Z|00209|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:44 np0005546909 nova_compute[187208]: 2025-12-05 12:02:44.915 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.030 187212 DEBUG nova.network.neutron [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.067 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.067 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance network_info: |[{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.068 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.068 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.072 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start _get_guest_xml network_info=[{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 1, 'encrypted': False, 'device_name': '/dev/vdb', 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.076 187212 WARNING nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.080 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.081 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.084 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.084 187212 DEBUG nova.virt.libvirt.host [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.085 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.085 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T12:01:20Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='307317883',id=25,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-1528646215',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.086 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.087 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.088 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.089 187212 DEBUG nova.virt.hardware [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.093 187212 DEBUG nova.virt.libvirt.vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.093 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.094 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.095 187212 DEBUG nova.objects.instance [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.118 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <uuid>bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</uuid>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <name>instance-0000001d</name>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-479694898</nova:name>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:45</nova:creationTime>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:flavor name="tempest-flavor_with_ephemeral_1-1528646215">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:ephemeral>1</nova:ephemeral>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:user uuid="79758a6c7516459bb1907270241d266a">tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member</nova:user>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:project uuid="342e6d694cf6482c9f1b7557a17bce60">tempest-ServersWithSpecificFlavorTestJSON-1976479976</nova:project>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        <nova:port uuid="b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="serial">bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="uuid">bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.eph0"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <target dev="vdb" bus="virtio"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:2e:33:fe"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <target dev="tapb5ee44c8-34"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/console.log" append="off"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:45 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:45 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:45 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:45 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.120 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Preparing to wait for external event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.120 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.121 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.121 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.122 187212 DEBUG nova.virt.libvirt.vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.122 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.123 187212 DEBUG nova.network.os_vif_util [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.123 187212 DEBUG os_vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.127 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.127 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5ee44c8-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.128 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5ee44c8-34, col_values=(('external_ids', {'iface-id': 'b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:33:fe', 'vm-uuid': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:45 np0005546909 NetworkManager[55691]: <info>  [1764936165.1307] manager: (tapb5ee44c8-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.133 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.142 187212 INFO os_vif [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34')#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.367 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.368 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] No VIF found with MAC fa:16:3e:2e:33:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.369 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Using config drive#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.379 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936150.3783724, adc15883-b705-42dd-ac95-04f4b8964012 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.380 187212 INFO nova.compute.manager [-] [instance: adc15883-b705-42dd-ac95-04f4b8964012] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.398 187212 DEBUG nova.compute.manager [None req-e5e43849-eccc-47f7-b43b-c1a21d7eaffe - - - - - -] [instance: adc15883-b705-42dd-ac95-04f4b8964012] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.901 187212 INFO nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Creating config drive at /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config#033[00m
Dec  5 07:02:45 np0005546909 nova_compute[187208]: 2025-12-05 12:02:45.907 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpextcoanm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.035 187212 DEBUG oslo_concurrency.processutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpextcoanm" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:46 np0005546909 kernel: tapb5ee44c8-34: entered promiscuous mode
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.1260] manager: (tapb5ee44c8-34): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:46Z|00210|binding|INFO|Claiming lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for this chassis.
Dec  5 07:02:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:46Z|00211|binding|INFO|b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff: Claiming fa:16:3e:2e:33:fe 10.100.0.9
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.151 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:33:fe 10.100.0.9'], port_security=['fa:16:3e:2e:33:fe 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '2', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.152 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 bound to our chassis#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.154 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.167 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bca352b9-6192-4896-abaa-f5d153379bc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.168 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap393d33f9-21 in ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.171 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap393d33f9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[febee430-1b05-4cad-97d1-36cddfbf0d25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[974287bd-cabb-48b0-8bb0-5cdecbcc302b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 systemd-udevd[218869]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.184 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebbf65a-3ea3-4beb-9780-394d24abe194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 systemd-machined[153543]: New machine qemu-33-instance-0000001d.
Dec  5 07:02:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:46Z|00212|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff ovn-installed in OVS
Dec  5 07:02:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:46Z|00213|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff up in Southbound
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.196 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.2006] device (tapb5ee44c8-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:02:46 np0005546909 systemd[1]: Started Virtual Machine qemu-33-instance-0000001d.
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.2014] device (tapb5ee44c8-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[887b6b1e-b724-4571-9643-8b9e5775bf0f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 podman[218850]: 2025-12-05 12:02:46.218590917 +0000 UTC m=+0.107007608 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.241 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[509f4b85-1858-4abd-a6de-f23579b4a14a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.262 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4d2fb4-494e-4174-9bfb-4f15515c8eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.2653] manager: (tap393d33f9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.295 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3ec3fc-e25e-416a-850a-23a8256664c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.299 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02ab0d77-8d4f-49a5-bffe-2a37f39a6b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.3213] device (tap393d33f9-20): carrier: link connected
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.327 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[55136a13-d4b8-40b6-a343-e842315e32e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.342 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f16fde2-cedb-482b-b686-c73a97093988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354688, 'reachable_time': 36852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218909, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72b76ae2-6228-480e-aa05-c133b6fd235a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe23:b198'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354688, 'tstamp': 354688}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218910, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f3557c0-d83a-46e0-a87f-fb3e504258ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap393d33f9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:23:b1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354688, 'reachable_time': 36852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218911, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.399 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c037f1-da00-461e-92e7-0ebed0b93076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.458 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c978c579-7032-4e29-bb61-2c962f9655f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.460 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap393d33f9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:46 np0005546909 kernel: tap393d33f9-20: entered promiscuous mode
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 NetworkManager[55691]: <info>  [1764936166.4654] manager: (tap393d33f9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.467 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap393d33f9-20, col_values=(('external_ids', {'iface-id': '4f5e3c8a-5273-4414-820c-16ae051153f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:46Z|00214|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.472 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8152e76d-7945-4402-b84a-d01feea87b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.473 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.pid.haproxy
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 393d33f9-2dde-4fb5-b5db-3f0fb98d4637
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:46.474 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'env', 'PROCESS_TAG=haproxy-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/393d33f9-2dde-4fb5-b5db-3f0fb98d4637.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.508 187212 DEBUG nova.compute.manager [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-changed-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.509 187212 DEBUG nova.compute.manager [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Refreshing instance network info cache due to event network-changed-d067fc33-ba4d-48f6-98f5-51ebca4adbc5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.510 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936166.554855, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.555 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Started (Lifecycle Event)#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.587 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936166.5555856, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.587 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.609 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.612 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.630 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.868 187212 DEBUG nova.network.neutron [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:46 np0005546909 podman[218950]: 2025-12-05 12:02:46.882049557 +0000 UTC m=+0.108418817 container create 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 07:02:46 np0005546909 podman[218950]: 2025-12-05 12:02:46.794828236 +0000 UTC m=+0.021197506 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:46 np0005546909 systemd[1]: Started libpod-conmon-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope.
Dec  5 07:02:46 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:46 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61c2e9c401d8673a2e5697d0fcec7ebf4e56c09bb215df29178f1e818f6cd815/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.961 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Releasing lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.961 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance network_info: |[{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.962 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.962 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Refreshing network info cache for port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.965 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start _get_guest_xml network_info=[{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.970 187212 WARNING nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.976 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.976 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.981 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.libvirt.host [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.982 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.983 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.984 187212 DEBUG nova.virt.hardware [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.987 187212 DEBUG nova.virt.libvirt.vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:38Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.988 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.988 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:46 np0005546909 nova_compute[187208]: 2025-12-05 12:02:46.989 187212 DEBUG nova.objects.instance [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <uuid>fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</uuid>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <name>instance-0000001e</name>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:name>tempest-InstanceActionsV221TestJSON-server-1938885940</nova:name>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:46</nova:creationTime>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:user uuid="e4a0640c63a14775b62a4d40c4860519">tempest-InstanceActionsV221TestJSON-44253202-project-member</nova:user>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:project uuid="6a2f4fffdace4b2fa0e0b6cdfc1055f5">tempest-InstanceActionsV221TestJSON-44253202</nova:project>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        <nova:port uuid="d067fc33-ba4d-48f6-98f5-51ebca4adbc5">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="serial">fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="uuid">fe8aefc3-96cb-4d4e-a684-1453a7df2fa1</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:cf:98:15"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <target dev="tapd067fc33-ba"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/console.log" append="off"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:47 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:47 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:47 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:47 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Preparing to wait for external event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.004 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.005 187212 DEBUG nova.virt.libvirt.vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:38Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG nova.network.os_vif_util [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.006 187212 DEBUG os_vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.007 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.008 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.011 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd067fc33-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.012 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd067fc33-ba, col_values=(('external_ids', {'iface-id': 'd067fc33-ba4d-48f6-98f5-51ebca4adbc5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:98:15', 'vm-uuid': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.013 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:47 np0005546909 NetworkManager[55691]: <info>  [1764936167.0143] manager: (tapd067fc33-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.016 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.020 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.021 187212 INFO os_vif [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba')#033[00m
Dec  5 07:02:47 np0005546909 podman[218950]: 2025-12-05 12:02:47.035951713 +0000 UTC m=+0.262321003 container init 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:02:47 np0005546909 podman[218950]: 2025-12-05 12:02:47.041571964 +0000 UTC m=+0.267941224 container start 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:47 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : New worker (218974) forked
Dec  5 07:02:47 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : Loading success.
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.077 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.078 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.078 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] No VIF found with MAC fa:16:3e:cf:98:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.079 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Using config drive#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.612 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.612 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.628 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.722 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.722 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.728 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.728 187212 INFO nova.compute.claims [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.901 187212 DEBUG nova.compute.provider_tree [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.906 187212 INFO nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Creating config drive at /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.910 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyo1hah9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.933 187212 DEBUG nova.scheduler.client.report [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.959 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:47 np0005546909 nova_compute[187208]: 2025-12-05 12:02:47.959 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.005 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.005 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.030 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.041 187212 DEBUG oslo_concurrency.processutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyo1hah9" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.048 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:02:48 np0005546909 kernel: tapd067fc33-ba: entered promiscuous mode
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.1057] manager: (tapd067fc33-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Dec  5 07:02:48 np0005546909 systemd-udevd[218902]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:48Z|00215|binding|INFO|Claiming lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 for this chassis.
Dec  5 07:02:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:48Z|00216|binding|INFO|d067fc33-ba4d-48f6-98f5-51ebca4adbc5: Claiming fa:16:3e:cf:98:15 10.100.0.10
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.1251] device (tapd067fc33-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.1264] device (tapd067fc33-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:02:48 np0005546909 systemd-machined[153543]: New machine qemu-34-instance-0000001e.
Dec  5 07:02:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:48Z|00217|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 ovn-installed in OVS
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 systemd[1]: Started Virtual Machine qemu-34-instance-0000001e.
Dec  5 07:02:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:48Z|00218|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 up in Southbound
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.439 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:98:15 10.100.0.10'], port_security=['fa:16:3e:cf:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69ff7ffc-62fc-4ff2-b5ba-0e716613e8dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50afd2c5-83ef-4c4d-9a1d-616d6eca472d, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d067fc33-ba4d-48f6-98f5-51ebca4adbc5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.440 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 in datapath 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 bound to our chassis#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.443 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.449 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updated VIF entry in instance network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.450 187212 DEBUG nova.network.neutron [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52e09350-476e-46d2-85cb-6062702988cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.454 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7364e4f7-51 in ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.457 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7364e4f7-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.457 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d928020d-85e0-487e-b654-007c0c7b2fa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.459 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5a5f5b16-e529-4d5c-899c-2c7a0d70935a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.474 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cbdfc457-35c2-4ae6-852f-e06fe635308b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.491 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c788a826-39f9-49d1-81d5-64c05d0c3a9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.505 187212 DEBUG oslo_concurrency.lockutils [req-5987b2cf-811a-46d8-baa6-c159e93124b9 req-4d173c10-2671-45d8-bac7-0638dda23139 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.518 187212 DEBUG nova.policy [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b5f1bf811e6c42d699922035de0b538c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.520 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc8ac7b-61e0-40b7-bc1a-ad546574366d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.524 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.525 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.525 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating image(s)#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.526 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.526 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.527 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.5275] manager: (tap7364e4f7-50): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.526 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[654af35f-a83a-46df-b4ec-75378276e0d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.548 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.5221484, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.548 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Started (Lifecycle Event)#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.550 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.567 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b69bee7f-e811-43f0-8885-067db3924e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.572 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[be88aeb8-e09b-497b-a9c1-ac1815feeb99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.582 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.587 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.5224235, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.588 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG nova.compute.manager [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.596 187212 DEBUG oslo_concurrency.lockutils [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.597 187212 DEBUG nova.compute.manager [req-0f2a68af-39c4-4b41-9606-c15a7c5e62e1 req-81304521-c01e-40ef-8e59-e56223af1bb7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Processing event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.597 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.602 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.6038] device (tap7364e4f7-50): carrier: link connected
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.607 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.611 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.612 187212 INFO nova.virt.libvirt.driver [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance spawned successfully.#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.613 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.621 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.621 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.622 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.610 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3d1f26-325b-409b-b979-df9f2d7f54ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.634 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f65194e-0e3e-4f86-a2b6-d55647b39633]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7364e4f7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:e4:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354916, 'reachable_time': 36098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219028, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.659 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.659 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936168.6013017, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.660 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.661 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6d508e-c766-4451-b01c-0dc4edd9a2b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:e450'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354916, 'tstamp': 354916}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219030, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.669 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.669 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.670 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.671 187212 DEBUG nova.virt.libvirt.driver [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.680 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[737b6254-820a-4338-a5d6-9b8103bd58ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7364e4f7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:e4:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354916, 'reachable_time': 36098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219031, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.682 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.692 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.699 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.699 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[333667aa-17ec-4d35-adbf-6afa872e64b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.723 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.735 187212 INFO nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 13.12 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.736 187212 DEBUG nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.741 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.743 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.743 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.773 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8541ba2d-f688-4346-a7c2-77f0a6c287ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.774 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7364e4f7-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.775 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.775 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7364e4f7-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.777 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 NetworkManager[55691]: <info>  [1764936168.7785] manager: (tap7364e4f7-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec  5 07:02:48 np0005546909 kernel: tap7364e4f7-50: entered promiscuous mode
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.781 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7364e4f7-50, col_values=(('external_ids', {'iface-id': '4371716d-4b21-4191-b690-7541d0a79660'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:48Z|00219|binding|INFO|Releasing lport 4371716d-4b21-4191-b690-7541d0a79660 from this chassis (sb_readonly=0)
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.784 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.785 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcd3118-5806-45e6-9865-702296e75f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.786 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.pid.haproxy
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:48.787 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'env', 'PROCESS_TAG=haproxy-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.808 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.810 187212 DEBUG nova.virt.disk.api [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Checking if we can resize image /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.810 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.832 187212 INFO nova.compute.manager [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 13.63 seconds to build instance.#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.867 187212 DEBUG oslo_concurrency.lockutils [None req-e65f4266-ca54-4d20-bd9e-674c2d0b6e3f 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.884 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.887 187212 DEBUG nova.virt.disk.api [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Cannot resize image /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.888 187212 DEBUG nova.objects.instance [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'migration_context' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.900 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.900 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Ensure instance console log exists: /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:48 np0005546909 nova_compute[187208]: 2025-12-05 12:02:48.901 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:49 np0005546909 podman[219074]: 2025-12-05 12:02:49.185063239 +0000 UTC m=+0.061894469 container create 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:02:49 np0005546909 systemd[1]: Started libpod-conmon-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope.
Dec  5 07:02:49 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:49 np0005546909 podman[219074]: 2025-12-05 12:02:49.152281202 +0000 UTC m=+0.029112452 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:49 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b94271b6f3b062c4a7ae06b3c93c6c47e697dfeccf693d310815cf2eb398a9d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:49 np0005546909 podman[219074]: 2025-12-05 12:02:49.281945576 +0000 UTC m=+0.158776806 container init 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:02:49 np0005546909 podman[219074]: 2025-12-05 12:02:49.290574492 +0000 UTC m=+0.167405722 container start 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:49 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : New worker (219096) forked
Dec  5 07:02:49 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : Loading success.
Dec  5 07:02:49 np0005546909 nova_compute[187208]: 2025-12-05 12:02:49.448 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Successfully created port: 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:02:49 np0005546909 nova_compute[187208]: 2025-12-05 12:02:49.576 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updated VIF entry in instance network info cache for port d067fc33-ba4d-48f6-98f5-51ebca4adbc5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:49 np0005546909 nova_compute[187208]: 2025-12-05 12:02:49.576 187212 DEBUG nova.network.neutron [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [{"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:49 np0005546909 nova_compute[187208]: 2025-12-05 12:02:49.604 187212 DEBUG oslo_concurrency.lockutils [req-dd6f544d-549d-44fd-9985-2edf74305e3a req-c1a9c505-d99f-4deb-8331-5674304f508f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.154 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936155.1529152, 1606eea3-5389-4437-b0f9-cfe6084d7871 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.155 187212 INFO nova.compute.manager [-] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.181 187212 DEBUG nova.compute.manager [None req-ea223c4b-90e6-4dac-ab6b-16464321c2e0 - - - - - -] [instance: 1606eea3-5389-4437-b0f9-cfe6084d7871] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.933 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Successfully updated port: 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:50 np0005546909 nova_compute[187208]: 2025-12-05 12:02:50.948 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.106 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.106 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.186 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.271 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.272 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.281 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.281 187212 INFO nova.compute.claims [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.488 187212 DEBUG nova.compute.provider_tree [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.508 187212 DEBUG nova.scheduler.client.report [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.532 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.533 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.604 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.605 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.626 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.645 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.652 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.747 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.749 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.749 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating image(s)#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.750 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.751 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.752 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.768 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.827 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.828 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.829 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.841 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.900 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.901 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.936 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.937 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.937 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.995 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.996 187212 DEBUG nova.virt.disk.api [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:02:51 np0005546909 nova_compute[187208]: 2025-12-05 12:02:51.997 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.057 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.058 187212 DEBUG nova.virt.disk.api [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.059 187212 DEBUG nova.objects.instance [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.075 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Ensure instance console log exists: /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.076 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.077 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:52 np0005546909 podman[219136]: 2025-12-05 12:02:52.209722672 +0000 UTC m=+0.055034193 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:02:52 np0005546909 podman[219135]: 2025-12-05 12:02:52.224038261 +0000 UTC m=+0.074385785 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.222 187212 DEBUG nova.policy [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.234 187212 DEBUG nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG oslo_concurrency.lockutils [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 DEBUG nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.235 187212 WARNING nova.compute.manager [req-6fdcb3bd-49af-4d61-a8f4-ff28194cdb77 req-17907f2d-eb8e-4837-9a25-7412da4b7404 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state active and task_state None.#033[00m
Dec  5 07:02:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:52Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:25:56 10.100.0.7
Dec  5 07:02:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:52Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:25:56 10.100.0.7
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.531 187212 DEBUG nova.compute.manager [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG oslo_concurrency.lockutils [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.532 187212 DEBUG nova.compute.manager [req-d2b0c678-8723-4e31-b6bc-d7cffd933d17 req-39be9ae0-6fea-4e82-b010-f18ed0752f56 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Processing event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.533 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.536 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.540 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936172.5399163, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.541 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.543 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.547 187212 INFO nova.virt.libvirt.driver [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance spawned successfully.#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.547 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.567 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.577 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.578 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.579 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.579 187212 DEBUG nova.virt.libvirt.driver [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.613 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.657 187212 INFO nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 13.82 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.658 187212 DEBUG nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.757 187212 INFO nova.compute.manager [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 14.37 seconds to build instance.#033[00m
Dec  5 07:02:52 np0005546909 nova_compute[187208]: 2025-12-05 12:02:52.779 187212 DEBUG oslo_concurrency.lockutils [None req-26129322-9edf-46b3-bd39-3333451a9a5d e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.311 187212 DEBUG nova.network.neutron [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.335 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.335 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance network_info: |[{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.338 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start _get_guest_xml network_info=[{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.347 187212 WARNING nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.362 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.363 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.369 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.370 187212 DEBUG nova.virt.libvirt.host [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.371 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.371 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.372 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.373 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.374 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.374 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.375 187212 DEBUG nova.virt.hardware [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.380 187212 DEBUG nova.virt.libvirt.vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-ImagesOneSe
rverTestJSON-1350277374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:48Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.380 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.382 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.383 187212 DEBUG nova.objects.instance [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'pci_devices' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.399 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <uuid>d70544d6-04e3-4b2a-914a-72db3052216a</uuid>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <name>instance-0000001f</name>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesOneServerTestJSON-server-1222437752</nova:name>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:53</nova:creationTime>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:user uuid="b5f1bf811e6c42d699922035de0b538c">tempest-ImagesOneServerTestJSON-1350277374-project-member</nova:user>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:project uuid="55d3be64e01442ca8f492d2f3e10d1cc">tempest-ImagesOneServerTestJSON-1350277374</nova:project>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        <nova:port uuid="99a1ab7f-bf64-4cc9-846c-9748ff4a93dc">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="serial">d70544d6-04e3-4b2a-914a-72db3052216a</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="uuid">d70544d6-04e3-4b2a-914a-72db3052216a</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:a9:8a:0c"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <target dev="tap99a1ab7f-bf"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/console.log" append="off"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:53 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:53 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:53 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:53 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.400 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Preparing to wait for external event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.409 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.410 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.410 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.411 187212 DEBUG nova.virt.libvirt.vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-I
magesOneServerTestJSON-1350277374-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:48Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.411 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.412 187212 DEBUG nova.network.os_vif_util [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.414 187212 DEBUG os_vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.416 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.416 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99a1ab7f-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.422 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap99a1ab7f-bf, col_values=(('external_ids', {'iface-id': '99a1ab7f-bf64-4cc9-846c-9748ff4a93dc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:8a:0c', 'vm-uuid': 'd70544d6-04e3-4b2a-914a-72db3052216a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.424 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:53 np0005546909 NetworkManager[55691]: <info>  [1764936173.4253] manager: (tap99a1ab7f-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.427 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.432 187212 INFO os_vif [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf')#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.491 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] No VIF found with MAC fa:16:3e:a9:8a:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.492 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Using config drive#033[00m
Dec  5 07:02:53 np0005546909 nova_compute[187208]: 2025-12-05 12:02:53.549 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Successfully created port: b785a426-63ba-453e-95dc-3aa63f9f75a9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.005 187212 INFO nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Creating config drive at /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.011 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpautor3nt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.138 187212 DEBUG oslo_concurrency.processutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpautor3nt" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:02:54 np0005546909 kernel: tap99a1ab7f-bf: entered promiscuous mode
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.2070] manager: (tap99a1ab7f-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Dec  5 07:02:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:54Z|00220|binding|INFO|Claiming lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for this chassis.
Dec  5 07:02:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:54Z|00221|binding|INFO|99a1ab7f-bf64-4cc9-846c-9748ff4a93dc: Claiming fa:16:3e:a9:8a:0c 10.100.0.12
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.213 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.224 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:8a:0c 10.100.0.12'], port_security=['fa:16:3e:a9:8a:0c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd70544d6-04e3-4b2a-914a-72db3052216a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39702279-01de-4f4b-bc33-58c8c6f673e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7da5af47-2519-44c3-bc78-6f5347e93e10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b6aed5-905a-43ff-81d8-6adfe368f476, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.227 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc in datapath 39702279-01de-4f4b-bc33-58c8c6f673e3 bound to our chassis#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.230 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 39702279-01de-4f4b-bc33-58c8c6f673e3#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.241 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d80db8c2-899f-4621-a519-f023130ccca3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.243 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap39702279-01 in ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.247 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap39702279-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46b13f17-7080-4e82-a12f-3c68257176d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.248 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bedc39ee-fc8c-4e2b-b129-49004f3dae3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 systemd-udevd[219189]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.263 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[55952aac-c8db-4a11-83c1-632ca82dd112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.2667] device (tap99a1ab7f-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.2678] device (tap99a1ab7f-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:54Z|00222|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc ovn-installed in OVS
Dec  5 07:02:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:54Z|00223|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc up in Southbound
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.282 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0dff8c46-75b0-479f-bcf1-acee21c915c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.283 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:54 np0005546909 systemd-machined[153543]: New machine qemu-35-instance-0000001f.
Dec  5 07:02:54 np0005546909 systemd[1]: Started Virtual Machine qemu-35-instance-0000001f.
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.324 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6184b2-4f5a-4509-a25e-ff5b6193a066]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.330 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8cc81d-9054-419a-a04a-f53df8548b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.3314] manager: (tap39702279-00): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.373 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c5207d-663e-4fef-a6b9-a80ffa994d94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.376 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[66677081-35c9-421e-90a2-446541e14336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.3996] device (tap39702279-00): carrier: link connected
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.404 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[de805b68-ca91-49a8-9a6f-9de202e99489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.441 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0dafb4e8-3a1a-4e40-a85c-7e01f6214745]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39702279-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:7b:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355496, 'reachable_time': 44993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219224, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.454 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92ce0d99-77d3-4e6b-af35-a6131441989a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:7bef'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355496, 'tstamp': 355496}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219227, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.475 187212 DEBUG nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.476 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[548a3c49-21fd-44d9-94df-4acb28392592]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap39702279-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:7b:ef'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355496, 'reachable_time': 44993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219231, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.506 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[31ea6568-2900-40a8-a973-992d609bf4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.552 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6e5a02-f83d-4ff5-b906-f83fe94c2a2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39702279-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39702279-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:54 np0005546909 NetworkManager[55691]: <info>  [1764936174.5581] manager: (tap39702279-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Dec  5 07:02:54 np0005546909 kernel: tap39702279-00: entered promiscuous mode
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap39702279-00, col_values=(('external_ids', {'iface-id': '55380907-78ff-4f14-8b9a-7ccb714bf36a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:54Z|00224|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.578 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43d38983-b2d1-43be-9394-5098de7b5f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.580 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-39702279-01de-4f4b-bc33-58c8c6f673e3
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/39702279-01de-4f4b-bc33-58c8c6f673e3.pid.haproxy
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 39702279-01de-4f4b-bc33-58c8c6f673e3
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:54.580 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'env', 'PROCESS_TAG=haproxy-39702279-01de-4f4b-bc33-58c8c6f673e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/39702279-01de-4f4b-bc33-58c8c6f673e3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.725 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936174.7248533, d70544d6-04e3-4b2a-914a-72db3052216a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.725 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Started (Lifecycle Event)#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.747 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.752 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936174.725062, d70544d6-04e3-4b2a-914a-72db3052216a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.752 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.775 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.777 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:02:54 np0005546909 nova_compute[187208]: 2025-12-05 12:02:54.803 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:02:54 np0005546909 podman[219264]: 2025-12-05 12:02:54.982128251 +0000 UTC m=+0.053423917 container create 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:02:55 np0005546909 systemd[1]: Started libpod-conmon-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope.
Dec  5 07:02:55 np0005546909 podman[219264]: 2025-12-05 12:02:54.95688135 +0000 UTC m=+0.028177046 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:55 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:55 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc8314cf85594aa36784fe5dbf1012ba087261d9468b0e3431f9bb9c756a87d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:55 np0005546909 podman[219264]: 2025-12-05 12:02:55.072623306 +0000 UTC m=+0.143919002 container init 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:02:55 np0005546909 podman[219264]: 2025-12-05 12:02:55.077644229 +0000 UTC m=+0.148939895 container start 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : New worker (219285) forked
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : Loading success.
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.160 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.161 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.162 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.163 187212 INFO nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Terminating instance#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.166 187212 DEBUG nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:02:55 np0005546909 kernel: tapd067fc33-ba (unregistering): left promiscuous mode
Dec  5 07:02:55 np0005546909 NetworkManager[55691]: <info>  [1764936175.1925] device (tapd067fc33-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:55Z|00225|binding|INFO|Releasing lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 from this chassis (sb_readonly=0)
Dec  5 07:02:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:55Z|00226|binding|INFO|Setting lport d067fc33-ba4d-48f6-98f5-51ebca4adbc5 down in Southbound
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:55Z|00227|binding|INFO|Removing iface tapd067fc33-ba ovn-installed in OVS
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.205 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:98:15 10.100.0.10'], port_security=['fa:16:3e:cf:98:15 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fe8aefc3-96cb-4d4e-a684-1453a7df2fa1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a2f4fffdace4b2fa0e0b6cdfc1055f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69ff7ffc-62fc-4ff2-b5ba-0e716613e8dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=50afd2c5-83ef-4c4d-9a1d-616d6eca472d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d067fc33-ba4d-48f6-98f5-51ebca4adbc5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.206 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d067fc33-ba4d-48f6-98f5-51ebca4adbc5 in datapath 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 unbound from our chassis#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.208 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc6030a-3ca7-48a2-a7f2-851ac5d1fffb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.210 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 namespace which is not needed anymore#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.209 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.251 187212 DEBUG nova.compute.manager [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-changed-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:55 np0005546909 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Dec  5 07:02:55 np0005546909 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d0000001e.scope: Consumed 2.909s CPU time.
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.252 187212 DEBUG nova.compute.manager [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Refreshing instance network info cache due to event network-changed-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.253 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.254 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.255 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Refreshing network info cache for port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:55 np0005546909 systemd-machined[153543]: Machine qemu-34-instance-0000001e terminated.
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [NOTICE]   (219094) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [WARNING]  (219094) : Exiting Master process...
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [ALERT]    (219094) : Current worker (219096) exited with code 143 (Terminated)
Dec  5 07:02:55 np0005546909 neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7[219090]: [WARNING]  (219094) : All workers exited. Exiting... (0)
Dec  5 07:02:55 np0005546909 systemd[1]: libpod-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope: Deactivated successfully.
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.343 187212 DEBUG nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.344 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.345 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:55 np0005546909 podman[219314]: 2025-12-05 12:02:55.345689656 +0000 UTC m=+0.048206798 container died 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.345 187212 DEBUG oslo_concurrency.lockutils [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.346 187212 DEBUG nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] No waiting events found dispatching network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.346 187212 WARNING nova.compute.manager [req-d736bc97-81a4-4528-b0e0-cf468cac5d52 req-8a6f5e44-6023-4892-a546-9a1890f3d47a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received unexpected event network-vif-plugged-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:02:55 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:55 np0005546909 systemd[1]: var-lib-containers-storage-overlay-3b94271b6f3b062c4a7ae06b3c93c6c47e697dfeccf693d310815cf2eb398a9d-merged.mount: Deactivated successfully.
Dec  5 07:02:55 np0005546909 podman[219314]: 2025-12-05 12:02:55.392472582 +0000 UTC m=+0.094989714 container cleanup 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec  5 07:02:55 np0005546909 systemd[1]: libpod-conmon-3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f.scope: Deactivated successfully.
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.461 187212 INFO nova.virt.libvirt.driver [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Instance destroyed successfully.#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.462 187212 DEBUG nova.objects.instance [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lazy-loading 'resources' on Instance uuid fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.477 187212 DEBUG nova.virt.libvirt.vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1938885940',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1938885940',id=30,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6a2f4fffdace4b2fa0e0b6cdfc1055f5',ramdisk_id='',reservation_id='r-n0y57b39',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsV221TestJSON-44253202',owner_user_name='tempest-InstanceActionsV221TestJSON-44253202-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:52Z,user_data=None,user_id='e4a0640c63a14775b62a4d40c4860519',uuid=fe8aefc3-96cb-4d4e-a684-1453a7df2fa1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.477 187212 DEBUG nova.network.os_vif_util [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converting VIF {"id": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "address": "fa:16:3e:cf:98:15", "network": {"id": "7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1340728484-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6a2f4fffdace4b2fa0e0b6cdfc1055f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd067fc33-ba", "ovs_interfaceid": "d067fc33-ba4d-48f6-98f5-51ebca4adbc5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.479 187212 DEBUG nova.network.os_vif_util [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.479 187212 DEBUG os_vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.482 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd067fc33-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.485 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.488 187212 INFO os_vif [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:98:15,bridge_name='br-int',has_traffic_filtering=True,id=d067fc33-ba4d-48f6-98f5-51ebca4adbc5,network=Network(7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd067fc33-ba')#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.489 187212 INFO nova.virt.libvirt.driver [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deleting instance files /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1_del#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.489 187212 INFO nova.virt.libvirt.driver [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deletion of /var/lib/nova/instances/fe8aefc3-96cb-4d4e-a684-1453a7df2fa1_del complete#033[00m
Dec  5 07:02:55 np0005546909 podman[219352]: 2025-12-05 12:02:55.501264009 +0000 UTC m=+0.085915345 container remove 3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[19513d5b-3614-4192-8da7-8e0718f680d7]: (4, ('Fri Dec  5 12:02:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 (3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f)\n3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f\nFri Dec  5 12:02:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 (3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f)\n3e1c8a39e4c42fdc288cec3aff648253206a81e375d39c3f0a60c5aaf81a400f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.514 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccc435bc-7e02-4831-80a2-b193c78bf3ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.515 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7364e4f7-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 kernel: tap7364e4f7-50: left promiscuous mode
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.521 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d84e912a-e46b-4934-aa97-5ff975cd9c8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.544 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3f72f5-a5bf-4e15-be97-e3fbdd6433c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.545 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6deec074-4923-46ba-b989-e70ba7f8ed68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.551 187212 INFO nova.compute.manager [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.552 187212 DEBUG oslo.service.loopingcall [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.553 187212 DEBUG nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:02:55 np0005546909 nova_compute[187208]: 2025-12-05 12:02:55.553 187212 DEBUG nova.network.neutron [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.571 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5733b884-37b3-4df7-82e4-0bff7789446d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354907, 'reachable_time': 17533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219372, 'error': None, 'target': 'ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.573 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7364e4f7-59f6-4a8e-95d7-dd1efe1ab7f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:55.574 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3de0c1-2c55-4469-9e9d-230fd3b54725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:55 np0005546909 systemd[1]: run-netns-ovnmeta\x2d7364e4f7\x2d59f6\x2d4a8e\x2d95d7\x2ddd1efe1ab7f7.mount: Deactivated successfully.
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.523 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updated VIF entry in instance network info cache for port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.523 187212 DEBUG nova.network.neutron [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.547 187212 DEBUG oslo_concurrency.lockutils [req-dcfdd684-10ab-4533-8d3d-d56777041ff5 req-0d16a23d-5a0d-469f-b462-706fba64b33f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.565 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 NetworkManager[55691]: <info>  [1764936176.5694] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Dec  5 07:02:56 np0005546909 NetworkManager[55691]: <info>  [1764936176.5706] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.598 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Successfully updated port: b785a426-63ba-453e-95dc-3aa63f9f75a9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.615 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00228|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00229|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00230|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 kernel: tap82089bf4-20 (unregistering): left promiscuous mode
Dec  5 07:02:56 np0005546909 NetworkManager[55691]: <info>  [1764936176.8273] device (tap82089bf4-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00231|binding|INFO|Releasing lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb from this chassis (sb_readonly=0)
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00232|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb down in Southbound
Dec  5 07:02:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:56Z|00233|binding|INFO|Removing iface tap82089bf4-20 ovn-installed in OVS
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.845 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis#033[00m
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.849 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.850 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bdac87a9-0e95-4272-aa40-7b45ffc1b5be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:56.851 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore#033[00m
Dec  5 07:02:56 np0005546909 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Dec  5 07:02:56 np0005546909 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000001c.scope: Consumed 12.859s CPU time.
Dec  5 07:02:56 np0005546909 systemd-machined[153543]: Machine qemu-32-instance-0000001c terminated.
Dec  5 07:02:56 np0005546909 nova_compute[187208]: 2025-12-05 12:02:56.967 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:02:56 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:56 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [NOTICE]   (218801) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:56 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [WARNING]  (218801) : Exiting Master process...
Dec  5 07:02:56 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [ALERT]    (218801) : Current worker (218803) exited with code 143 (Terminated)
Dec  5 07:02:56 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[218797]: [WARNING]  (218801) : All workers exited. Exiting... (0)
Dec  5 07:02:56 np0005546909 systemd[1]: libpod-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope: Deactivated successfully.
Dec  5 07:02:56 np0005546909 podman[219395]: 2025-12-05 12:02:56.985037561 +0000 UTC m=+0.052556492 container died 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:57 np0005546909 systemd[1]: var-lib-containers-storage-overlay-df6dbe72f381b99c1a982824974be4cad9c714499ad9f3e0b3b9579fdd54f357-merged.mount: Deactivated successfully.
Dec  5 07:02:57 np0005546909 podman[219395]: 2025-12-05 12:02:57.043149521 +0000 UTC m=+0.110668442 container cleanup 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 07:02:57 np0005546909 systemd[1]: libpod-conmon-5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5.scope: Deactivated successfully.
Dec  5 07:02:57 np0005546909 kernel: tap82089bf4-20: entered promiscuous mode
Dec  5 07:02:57 np0005546909 systemd-udevd[219214]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00234|binding|INFO|Claiming lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb for this chassis.
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00235|binding|INFO|82089bf4-207e-4880-b8ff-9bf09a4ac3fb: Claiming fa:16:3e:53:25:56 10.100.0.7
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 NetworkManager[55691]: <info>  [1764936177.0690] manager: (tap82089bf4-20): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Dec  5 07:02:57 np0005546909 kernel: tap82089bf4-20 (unregistering): left promiscuous mode
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00236|binding|INFO|Setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb ovn-installed in OVS
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00237|if_status|INFO|Not setting lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb down as sb is readonly
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 podman[219428]: 2025-12-05 12:02:57.121832179 +0000 UTC m=+0.050334399 container remove 5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.128 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e72e3c02-bd24-4cd5-8ddb-701b9b13ba56]: (4, ('Fri Dec  5 12:02:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5)\n5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5\nFri Dec  5 12:02:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5)\n5b2f6a29f461749c918de056035b9dc72bdb98226152910b45cdde25844847a5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.130 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6da58949-93d3-402a-9e02-09819381cb68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 kernel: tap41b3b495-c0: left promiscuous mode
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.147 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.151 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c22d8e7-e5e0-4739-adb9-6827387750e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.167 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad222c77-6f28-4769-bca3-a140830b9082]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.169 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f44ac454-f43f-4e82-9dfb-022e03f1b6c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.182 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c8e6e4f4-d7d8-478c-bbe6-6ce850318ff2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353779, 'reachable_time': 38401, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219456, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.184 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.184 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a80faedf-a9b1-4ac9-add0-ca34fa20d15f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00238|binding|INFO|Releasing lport 82089bf4-207e-4880-b8ff-9bf09a4ac3fb from this chassis (sb_readonly=0)
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.323 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.324 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.326 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.339 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.339 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:25:56 10.100.0.7'], port_security=['fa:16:3e:53:25:56 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '9efa988a-19ae-440a-8a56-0bac68cb3c9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=82089bf4-207e-4880-b8ff-9bf09a4ac3fb) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[662291bc-93ff-47e2-85bc-66e3b1fd1c39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.341 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c4cfd6a1-8840-4e1d-8df4-4f76d7aeb5d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6039f0b5-d917-4cb1-9478-65be7cc93ea2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.356 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6deb6055-8941-4fdc-bb60-091e4cd60cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0db11547-e025-46ab-b8ef-96982e732441]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.397 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[164d9ff9-ee75-499a-9267-5a38201ad610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.432 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3088a0d2-9057-436a-8a6d-42ff8ee8bcef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 NetworkManager[55691]: <info>  [1764936177.4339] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.469 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f7fff7f-b1d6-419c-9b4f-ba2d50b3c635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.472 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aec30d88-f539-4d9c-ad36-51abcc5be7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 systemd-udevd[219515]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:02:57 np0005546909 NetworkManager[55691]: <info>  [1764936177.4951] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.503 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1abeb798-bd34-43c1-a2c5-27693604031f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.505 187212 INFO nova.virt.libvirt.driver [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:02:57 np0005546909 podman[219459]: 2025-12-05 12:02:57.509847812 +0000 UTC m=+0.131809296 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.510 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance destroyed successfully.#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.510 187212 DEBUG nova.objects.instance [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'numa_topology' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:57 np0005546909 podman[219461]: 2025-12-05 12:02:57.517268504 +0000 UTC m=+0.135237444 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.518 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61711658-da08-4de3-a0bd-41c68bcaff7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355805, 'reachable_time': 21360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219535, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.527 187212 DEBUG nova.compute.manager [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b5e546-53fa-4051-be52-c9e09cbdaa92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355805, 'tstamp': 355805}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219536, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.549 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[123090fc-2700-46b1-96ec-03fb22432a3a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355805, 'reachable_time': 21360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219537, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.571 187212 DEBUG oslo_concurrency.lockutils [None req-d665e5d0-d988-4b25-a78d-3b91c3a41480 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4fcccf-64e4-4057-9a89-dde768a2a304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.627 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb664645-d0b5-49fe-8679-32f6b2697995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.629 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 NetworkManager[55691]: <info>  [1764936177.6323] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Dec  5 07:02:57 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:02:57Z|00239|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG nova.compute.manager [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-changed-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG nova.compute.manager [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Refreshing instance network info cache due to event network-changed-b785a426-63ba-453e-95dc-3aa63f9f75a9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.655 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.656 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.657 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.657 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c7b386-6f02-44f7-a5d9-a85daafd08de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.658 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:02:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:57.659 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.967 187212 DEBUG nova.network.neutron [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:57 np0005546909 nova_compute[187208]: 2025-12-05 12:02:57.989 187212 INFO nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Took 2.44 seconds to deallocate network for instance.#033[00m
Dec  5 07:02:58 np0005546909 podman[219572]: 2025-12-05 12:02:58.000245399 +0000 UTC m=+0.054998062 container create 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:02:58 np0005546909 systemd[1]: Started libpod-conmon-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope.
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.056 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.057 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:58 np0005546909 podman[219572]: 2025-12-05 12:02:57.971505858 +0000 UTC m=+0.026258531 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:02:58 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:02:58 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e4710660d98a71ae1e5a3da48dbc488f07c954d543d7bb420eaeff7c7de543/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:02:58 np0005546909 podman[219572]: 2025-12-05 12:02:58.098704751 +0000 UTC m=+0.153457414 container init 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:02:58 np0005546909 podman[219572]: 2025-12-05 12:02:58.104910339 +0000 UTC m=+0.159662982 container start 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : New worker (219593) forked
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : Loading success.
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.170 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 82089bf4-207e-4880-b8ff-9bf09a4ac3fb in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.173 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c04d690b-14a1-4dc6-a604-8ae881e3e4cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.215 187212 DEBUG nova.compute.provider_tree [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.230 187212 DEBUG nova.scheduler.client.report [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.258 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.283 187212 INFO nova.scheduler.client.report [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Deleted allocations for instance fe8aefc3-96cb-4d4e-a684-1453a7df2fa1#033[00m
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : haproxy version is 2.8.14-c23fe91
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [NOTICE]   (219591) : path to executable is /usr/sbin/haproxy
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : Exiting Master process...
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : Exiting Master process...
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [ALERT]    (219591) : Current worker (219593) exited with code 143 (Terminated)
Dec  5 07:02:58 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[219587]: [WARNING]  (219591) : All workers exited. Exiting... (0)
Dec  5 07:02:58 np0005546909 systemd[1]: libpod-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope: Deactivated successfully.
Dec  5 07:02:58 np0005546909 podman[219619]: 2025-12-05 12:02:58.316971306 +0000 UTC m=+0.049465094 container died 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:02:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24-userdata-shm.mount: Deactivated successfully.
Dec  5 07:02:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay-47e4710660d98a71ae1e5a3da48dbc488f07c954d543d7bb420eaeff7c7de543-merged.mount: Deactivated successfully.
Dec  5 07:02:58 np0005546909 podman[219619]: 2025-12-05 12:02:58.356548665 +0000 UTC m=+0.089042463 container cleanup 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.363 187212 DEBUG oslo_concurrency.lockutils [None req-3d592639-af3e-4546-b405-c0a63b7721ed e4a0640c63a14775b62a4d40c4860519 6a2f4fffdace4b2fa0e0b6cdfc1055f5 - - default default] Lock "fe8aefc3-96cb-4d4e-a684-1453a7df2fa1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:58 np0005546909 systemd[1]: libpod-conmon-8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24.scope: Deactivated successfully.
Dec  5 07:02:58 np0005546909 podman[219649]: 2025-12-05 12:02:58.414446769 +0000 UTC m=+0.037838132 container remove 8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.419 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[380d1215-a7f4-433a-a2a2-79359fd95416]: (4, ('Fri Dec  5 12:02:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24)\n8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24\nFri Dec  5 12:02:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24)\n8db6f840ce46990b8cc7d8556eb13691dcdb5c05cc9dd14f4a1e769c99d6af24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.422 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed96107c-2bbb-4659-8baa-678042bdc084]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.423 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:58 np0005546909 kernel: tap41b3b495-c0: left promiscuous mode
Dec  5 07:02:58 np0005546909 nova_compute[187208]: 2025-12-05 12:02:58.441 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.445 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72758f80-0660-4d67-b3cc-ea6d12dfaf02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.463 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5709e252-b7e7-4a26-8591-7b76f2b04f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.464 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[739d74f7-6031-45e8-a624-88e6067c0dd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e86514e-37ba-417d-859e-3f0321402d6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355795, 'reachable_time': 39271, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219667, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.480 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:02:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:02:58.480 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[afc492e0-00d7-4029-bfd9-34a73c2f1a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:02:58 np0005546909 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.016 187212 DEBUG nova.network.neutron [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.046 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.047 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance network_info: |[{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.047 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.048 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Refreshing network info cache for port b785a426-63ba-453e-95dc-3aa63f9f75a9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.051 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start _get_guest_xml network_info=[{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.056 187212 WARNING nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.072 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.073 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.082 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.083 187212 DEBUG nova.virt.libvirt.host [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.084 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.085 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.086 187212 DEBUG nova.virt.hardware [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.090 187212 DEBUG nova.virt.libvirt.vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:51Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.091 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.091 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.093 187212 DEBUG nova.objects.instance [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <uuid>2f42f732-65c6-4c4a-9332-47098d7350b9</uuid>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <name>instance-00000020</name>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersTestJSON-server-1246678959</nova:name>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:02:59</nova:creationTime>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        <nova:port uuid="b785a426-63ba-453e-95dc-3aa63f9f75a9">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="serial">2f42f732-65c6-4c4a-9332-47098d7350b9</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="uuid">2f42f732-65c6-4c4a-9332-47098d7350b9</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:4e:26:bd"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <target dev="tapb785a426-63"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/console.log" append="off"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:02:59 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:02:59 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:02:59 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:02:59 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Preparing to wait for external event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.109 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.110 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.110 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.virt.libvirt.vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:02:51Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.111 187212 DEBUG nova.network.os_vif_util [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.112 187212 DEBUG os_vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.112 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb785a426-63, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.117 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb785a426-63, col_values=(('external_ids', {'iface-id': 'b785a426-63ba-453e-95dc-3aa63f9f75a9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:26:bd', 'vm-uuid': '2f42f732-65c6-4c4a-9332-47098d7350b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:59 np0005546909 NetworkManager[55691]: <info>  [1764936179.1197] manager: (tapb785a426-63): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.129 187212 INFO os_vif [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63')#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.187 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.188 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.188 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:4e:26:bd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.189 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Using config drive#033[00m
Dec  5 07:02:59 np0005546909 nova_compute[187208]: 2025-12-05 12:02:59.196 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.161 187212 INFO nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Creating config drive at /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.166 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmskh5vso execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.290 187212 DEBUG oslo_concurrency.processutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmskh5vso" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.3430] manager: (tapb785a426-63): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Dec  5 07:03:00 np0005546909 systemd-udevd[219533]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:00 np0005546909 kernel: tapb785a426-63: entered promiscuous mode
Dec  5 07:03:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:00Z|00240|binding|INFO|Claiming lport b785a426-63ba-453e-95dc-3aa63f9f75a9 for this chassis.
Dec  5 07:03:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:00Z|00241|binding|INFO|b785a426-63ba-453e-95dc-3aa63f9f75a9: Claiming fa:16:3e:4e:26:bd 10.100.0.9
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.3626] device (tapb785a426-63): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.3640] device (tapb785a426-63): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:00Z|00242|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 ovn-installed in OVS
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.375 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:00Z|00243|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 up in Southbound
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.419 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:26:bd 10.100.0.9'], port_security=['fa:16:3e:4e:26:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f42f732-65c6-4c4a-9332-47098d7350b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b785a426-63ba-453e-95dc-3aa63f9f75a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.420 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b785a426-63ba-453e-95dc-3aa63f9f75a9 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.423 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.433 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9b87fb-8f17-496f-834f-cb3214b3ddd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.434 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.436 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.436 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b719c047-b8e1-4b60-8ee5-07f2457a1483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5fdbe6a-1291-4a30-9184-17bc18e50145]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.447 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[39ac8c3e-c7e4-4880-8062-9b3e6d1ec5a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f596888-25e9-446e-9d56-79368c4d300a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 systemd[1]: Started Virtual Machine qemu-36-instance-00000020.
Dec  5 07:03:00 np0005546909 systemd-machined[153543]: New machine qemu-36-instance-00000020.
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.498 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[53f077f3-abda-4c06-abca-1c6c76375e69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.5051] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.503 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[027a8405-71dd-4769-9926-f1f96c947a67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 systemd-udevd[219744]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.551 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[50625570-e2ae-4889-ae6b-bd23c5cef478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.555 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[64413629-bf9e-46b2-b8e5-46fa7d4e7058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.5782] device (tapd7360f84-b0): carrier: link connected
Dec  5 07:03:00 np0005546909 podman[219718]: 2025-12-05 12:03:00.581697263 +0000 UTC m=+0.104719852 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.584 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3828a0a1-b3c6-4ded-ae08-b07c7de5aa70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.602 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[898d3411-2677-440b-83b8-fd1b44db997c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356114, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219770, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[999acab2-f81a-46f1-885e-3b21dd5f3594]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 356114, 'tstamp': 356114}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219771, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.632 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[833385d3-be34-4b3c-b869-103a13012601]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356114, 'reachable_time': 43986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219772, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.664 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6959e2-fb9d-4703-9ae9-80d0fa3f6597]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.716 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d299b329-87f3-460a-9e16-8638d73fde87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.719 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 NetworkManager[55691]: <info>  [1764936180.7201] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Dec  5 07:03:00 np0005546909 kernel: tapd7360f84-b0: entered promiscuous mode
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.727 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.731 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:00Z|00244|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.745 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.746 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.747 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6afb1ff5-bb7f-4213-8e63-1a1ccd7c4eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.749 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:00 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:00.749 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.779 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936180.7791884, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.781 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.805 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.810 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936180.7792728, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.827 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.830 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:00 np0005546909 nova_compute[187208]: 2025-12-05 12:03:00.845 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:01 np0005546909 podman[219816]: 2025-12-05 12:03:01.1049533 +0000 UTC m=+0.051759000 container create ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:03:01 np0005546909 systemd[1]: Started libpod-conmon-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope.
Dec  5 07:03:01 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:01 np0005546909 podman[219816]: 2025-12-05 12:03:01.076766544 +0000 UTC m=+0.023572044 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:01 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85ca59d6a34c179f3b81febc919aaa04cecc14ab33813fa899326030a0893116/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:01Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:33:fe 10.100.0.9
Dec  5 07:03:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:01Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:33:fe 10.100.0.9
Dec  5 07:03:01 np0005546909 podman[219816]: 2025-12-05 12:03:01.371550895 +0000 UTC m=+0.318356415 container init ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:01 np0005546909 podman[219816]: 2025-12-05 12:03:01.377858235 +0000 UTC m=+0.324663725 container start ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.410 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.410 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing instance network info cache due to event network-changed-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.411 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Refreshing network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:01 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : New worker (219836) forked
Dec  5 07:03:01 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : Loading success.
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.807 187212 DEBUG nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG nova.compute.manager [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.830 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG oslo_concurrency.lockutils [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG nova.compute.manager [req-8d5e8daa-9abe-4534-9ccc-056f70222e33 req-17abcdb6-95e1-496f-b0db-855f5ec5a6aa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Processing event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.831 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.834 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936181.8345957, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.836 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.839 187212 INFO nova.virt.libvirt.driver [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance spawned successfully.#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.839 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.879 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.881 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.882 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.883 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.883 187212 DEBUG nova.virt.libvirt.driver [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.887 187212 INFO nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] instance snapshotting#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.887 187212 WARNING nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.889 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.929 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.981 187212 INFO nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 10.23 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:03:01 np0005546909 nova_compute[187208]: 2025-12-05 12:03:01.981 187212 DEBUG nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.073 187212 INFO nova.compute.manager [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 10.82 seconds to build instance.#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.088 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updated VIF entry in instance network info cache for port b785a426-63ba-453e-95dc-3aa63f9f75a9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.088 187212 DEBUG nova.network.neutron [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [{"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.092 187212 DEBUG oslo_concurrency.lockutils [None req-a2f9f7d6-825c-4bfb-a190-290c33eb9fe9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.115 187212 DEBUG oslo_concurrency.lockutils [req-73daf3ec-4e23-439a-814d-8dbc690ead22 req-2e7f6435-3d7d-4524-a936-d79f0bb524c7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2f42f732-65c6-4c4a-9332-47098d7350b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.207 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Beginning cold snapshot process#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.434 187212 DEBUG nova.privsep.utils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.435 187212 DEBUG oslo_concurrency.processutils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk /var/lib/nova/instances/snapshots/tmpv7bua9ay/621f542e4f4a4ada9eb46bd90c685ad0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.727 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.834 187212 DEBUG oslo_concurrency.processutils [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e/disk /var/lib/nova/instances/snapshots/tmpv7bua9ay/621f542e4f4a4ada9eb46bd90c685ad0" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:02 np0005546909 nova_compute[187208]: 2025-12-05 12:03:02.835 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:03:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.010 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:03 np0005546909 nova_compute[187208]: 2025-12-05 12:03:03.714 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updated VIF entry in instance network info cache for port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:03 np0005546909 nova_compute[187208]: 2025-12-05 12:03:03.715 187212 DEBUG nova.network.neutron [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [{"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:03 np0005546909 nova_compute[187208]: 2025-12-05 12:03:03.741 187212 DEBUG oslo_concurrency.lockutils [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:03 np0005546909 nova_compute[187208]: 2025-12-05 12:03:03.742 187212 DEBUG nova.compute.manager [req-932ad7f5-338a-4bfe-a5cf-0728288f67c9 req-c3218f1e-5b37-4db8-9bf3-187e70873b66 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Received event network-vif-deleted-d067fc33-ba4d-48f6-98f5-51ebca4adbc5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.251 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.252 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.252 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Processing event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.253 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received unexpected event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.254 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-unplugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state stopped and task_state image_uploading.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.255 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG oslo_concurrency.lockutils [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 DEBUG nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] No waiting events found dispatching network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.256 187212 WARNING nova.compute.manager [req-c3af8250-65e9-4d81-9522-1a46e69bae92 req-81da759f-019e-4760-add8-0d9397af8664 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received unexpected event network-vif-plugged-82089bf4-207e-4880-b8ff-9bf09a4ac3fb for instance with vm_state stopped and task_state image_uploading.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.258 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.263 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.264 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936184.2641354, d70544d6-04e3-4b2a-914a-72db3052216a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.264 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.271 187212 INFO nova.virt.libvirt.driver [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance spawned successfully.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.271 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.293 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.304 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.305 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.306 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.307 187212 DEBUG nova.virt.libvirt.driver [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.321 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.386 187212 INFO nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 15.86 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.387 187212 DEBUG nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.503 187212 INFO nova.compute.manager [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 16.81 seconds to build instance.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.563 187212 DEBUG oslo_concurrency.lockutils [None req-8675b6e2-d377-4d1e-a9d3-56f253ff7486 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.564 187212 DEBUG nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG oslo_concurrency.lockutils [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.565 187212 DEBUG nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.566 187212 WARNING nova.compute.manager [req-b2dc3830-ba7d-4f47-8dec-2f427eed1cf5 req-fc568c5c-9eae-4c7b-9511-f53d435d658a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:03:04 np0005546909 nova_compute[187208]: 2025-12-05 12:03:04.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.043 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.043 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.053 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.054 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.055 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.056 187212 INFO nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Terminating instance#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.057 187212 DEBUG nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.066 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:03:05 np0005546909 kernel: tapb785a426-63 (unregistering): left promiscuous mode
Dec  5 07:03:05 np0005546909 NetworkManager[55691]: <info>  [1764936185.0834] device (tapb785a426-63): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:05Z|00245|binding|INFO|Releasing lport b785a426-63ba-453e-95dc-3aa63f9f75a9 from this chassis (sb_readonly=0)
Dec  5 07:03:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:05Z|00246|binding|INFO|Setting lport b785a426-63ba-453e-95dc-3aa63f9f75a9 down in Southbound
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:05Z|00247|binding|INFO|Removing iface tapb785a426-63 ovn-installed in OVS
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.102 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:26:bd 10.100.0.9'], port_security=['fa:16:3e:4e:26:bd 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f42f732-65c6-4c4a-9332-47098d7350b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b785a426-63ba-453e-95dc-3aa63f9f75a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.103 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b785a426-63ba-453e-95dc-3aa63f9f75a9 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.105 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[29c48909-5bcb-499b-8ba4-f41f925353ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.107 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000020.scope: Deactivated successfully.
Dec  5 07:03:05 np0005546909 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000020.scope: Consumed 3.541s CPU time.
Dec  5 07:03:05 np0005546909 systemd-machined[153543]: Machine qemu-36-instance-00000020 terminated.
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.167 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.168 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.177 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.178 187212 INFO nova.compute.claims [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:03:05 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:05 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [NOTICE]   (219834) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:05 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [WARNING]  (219834) : Exiting Master process...
Dec  5 07:03:05 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [ALERT]    (219834) : Current worker (219836) exited with code 143 (Terminated)
Dec  5 07:03:05 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[219830]: [WARNING]  (219834) : All workers exited. Exiting... (0)
Dec  5 07:03:05 np0005546909 systemd[1]: libpod-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope: Deactivated successfully.
Dec  5 07:03:05 np0005546909 podman[219880]: 2025-12-05 12:03:05.247432842 +0000 UTC m=+0.050471983 container died ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:05 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:05 np0005546909 systemd[1]: var-lib-containers-storage-overlay-85ca59d6a34c179f3b81febc919aaa04cecc14ab33813fa899326030a0893116-merged.mount: Deactivated successfully.
Dec  5 07:03:05 np0005546909 podman[219880]: 2025-12-05 12:03:05.29184194 +0000 UTC m=+0.094881061 container cleanup ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:03:05 np0005546909 systemd[1]: libpod-conmon-ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3.scope: Deactivated successfully.
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.333 187212 INFO nova.virt.libvirt.driver [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Instance destroyed successfully.#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.335 187212 DEBUG nova.objects.instance [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 2f42f732-65c6-4c4a-9332-47098d7350b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.350 187212 DEBUG nova.virt.libvirt.vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1246678959',display_name='tempest-DeleteServersTestJSON-server-1246678959',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1246678959',id=32,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-1e9h9ypl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:02Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=2f42f732-65c6-4c4a-9332-47098d7350b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.351 187212 DEBUG nova.network.os_vif_util [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "address": "fa:16:3e:4e:26:bd", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb785a426-63", "ovs_interfaceid": "b785a426-63ba-453e-95dc-3aa63f9f75a9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.351 187212 DEBUG nova.network.os_vif_util [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.352 187212 DEBUG os_vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.354 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb785a426-63, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.364 187212 INFO nova.virt.libvirt.driver [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Snapshot image upload complete#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.365 187212 INFO nova.compute.manager [None req-aff4e03d-01d9-4074-b7f5-4661a0731e40 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 3.48 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:03:05 np0005546909 podman[219921]: 2025-12-05 12:03:05.36607631 +0000 UTC m=+0.051016048 container remove ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.371 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.372 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[811f5e61-d239-4e98-801e-529142779747]: (4, ('Fri Dec  5 12:03:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3)\nba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3\nFri Dec  5 12:03:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (ba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3)\nba3df224f1156d64e1923b60e3371a78d0dcf58d06f44dbf0bbc24952e1fdfc3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.373 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.374 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a29da469-e358-4869-922d-5661f131d5bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.375 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.375 187212 INFO os_vif [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:26:bd,bridge_name='br-int',has_traffic_filtering=True,id=b785a426-63ba-453e-95dc-3aa63f9f75a9,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb785a426-63')#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.376 187212 INFO nova.virt.libvirt.driver [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deleting instance files /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9_del#033[00m
Dec  5 07:03:05 np0005546909 kernel: tapd7360f84-b0: left promiscuous mode
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.376 187212 INFO nova.virt.libvirt.driver [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deletion of /var/lib/nova/instances/2f42f732-65c6-4c4a-9332-47098d7350b9_del complete#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.378 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[23132262-e872-4766-ac60-9c2af0ff5d4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.397 187212 DEBUG nova.compute.provider_tree [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c0c190-8495-4f1a-b31c-6e75c73fc957]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.422 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c34ee1-fa11-44fe-b865-717b4c250121]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.431 187212 DEBUG nova.scheduler.client.report [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.436 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[476b3b1c-34d4-4807-bef0-bdad5cd61db2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356105, 'reachable_time': 17391, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219946, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.442 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:05.442 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[860e3b71-f644-4c2c-a6c1-0b60d8faadf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.442 187212 INFO nova.compute.manager [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG oslo.service.loopingcall [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.443 187212 DEBUG nova.network.neutron [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.456 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.457 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.497 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.497 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.514 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.529 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.627 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating image(s)#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.629 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.630 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.631 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.644 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.704 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.705 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.706 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.718 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.779 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.780 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.818 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.819 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.820 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.896 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.900 187212 DEBUG nova.virt.disk.api [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Checking if we can resize image /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.901 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.927 187212 DEBUG nova.policy [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.963 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.964 187212 DEBUG nova.virt.disk.api [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Cannot resize image /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.964 187212 DEBUG nova.objects.instance [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'migration_context' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.982 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.983 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Ensure instance console log exists: /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.983 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.984 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:05 np0005546909 nova_compute[187208]: 2025-12-05 12:03:05.984 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.587 187212 DEBUG nova.network.neutron [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.615 187212 INFO nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Took 2.17 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:07Z|00248|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:03:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:07Z|00249|binding|INFO|Releasing lport 4f5e3c8a-5273-4414-820c-16ae051153f4 from this chassis (sb_readonly=0)
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.659 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.661 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:07Z|00250|binding|INFO|Removing iface tap82089bf4-20 ovn-installed in OVS
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.832 187212 DEBUG nova.compute.provider_tree [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.853 187212 DEBUG nova.scheduler.client.report [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.882 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.908 187212 INFO nova.scheduler.client.report [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 2f42f732-65c6-4c4a-9332-47098d7350b9#033[00m
Dec  5 07:03:07 np0005546909 nova_compute[187208]: 2025-12-05 12:03:07.991 187212 DEBUG oslo_concurrency.lockutils [None req-ce42c3f2-dc6a-4780-928e-0e10d87b5e4b ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.085 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Successfully created port: 0a11e563-2be9-4ce9-af51-7d29b586e233 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:03:08 np0005546909 podman[219962]: 2025-12-05 12:03:08.20646988 +0000 UTC m=+0.058903503 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.357 187212 DEBUG nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.359 187212 DEBUG oslo_concurrency.lockutils [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.360 187212 DEBUG nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.360 187212 WARNING nova.compute.manager [req-6925f8cb-8728-464d-9320-1ee04e5f32e4 req-274efb12-8de8-4e77-94c8-a14fb5f987f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-unplugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:03:08 np0005546909 nova_compute[187208]: 2025-12-05 12:03:08.692 187212 DEBUG nova.compute.manager [req-1a510430-9571-4c81-b112-eb6985dd49cd req-00a56ab8-e4ee-488f-b129-5b7634a09313 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-deleted-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.272 187212 DEBUG nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.322 187212 INFO nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] instance snapshotting#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.583 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Beginning live snapshot process#033[00m
Dec  5 07:03:09 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.739 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.808 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.809 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.878 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.891 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.953 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.955 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.993 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:09 np0005546909 nova_compute[187208]: 2025-12-05 12:03:09.995 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.057 187212 DEBUG nova.virt.libvirt.guest [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.062 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.274 187212 DEBUG nova.privsep.utils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.276 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.432 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936175.431474, fe8aefc3-96cb-4d4e-a684-1453a7df2fa1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.434 187212 INFO nova.compute.manager [-] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.455 187212 DEBUG nova.compute.manager [None req-787eb86c-dcfc-42f5-aa36-29c5c3821099 - - - - - -] [instance: fe8aefc3-96cb-4d4e-a684-1453a7df2fa1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.604 187212 DEBUG oslo_concurrency.processutils [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868.delta /var/lib/nova/instances/snapshots/tmpop97y61o/8d979427add1471fae85232b1fc23868" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.606 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.655 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Successfully updated port: 0a11e563-2be9-4ce9-af51-7d29b586e233 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.679 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.680 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquired lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:10 np0005546909 nova_compute[187208]: 2025-12-05 12:03:10.680 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:11 np0005546909 nova_compute[187208]: 2025-12-05 12:03:11.019 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.118 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936177.117115, 9efa988a-19ae-440a-8a56-0bac68cb3c9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.119 187212 INFO nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.147 187212 DEBUG nova.compute.manager [None req-2707a026-2eec-4940-9471-39323d98146d - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.152 187212 DEBUG nova.compute.manager [None req-2707a026-2eec-4940-9471-39323d98146d - - - - - -] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.515 187212 DEBUG nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG oslo_concurrency.lockutils [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2f42f732-65c6-4c4a-9332-47098d7350b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.516 187212 DEBUG nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] No waiting events found dispatching network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.517 187212 WARNING nova.compute.manager [req-a9987a7f-1c3a-4f42-906e-a8c5ac1d3deb req-d28184d8-47df-4fd1-9cb1-c55beddab3a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Received unexpected event network-vif-plugged-b785a426-63ba-453e-95dc-3aa63f9f75a9 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.574 187212 DEBUG nova.network.neutron [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.605 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Releasing lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.605 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance network_info: |[{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.608 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start _get_guest_xml network_info=[{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.611 187212 WARNING nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.619 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.620 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.623 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.623 187212 DEBUG nova.virt.libvirt.host [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.624 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.624 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.625 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.625 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.626 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.627 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.628 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.628 187212 DEBUG nova.virt.hardware [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.632 187212 DEBUG nova.virt.libvirt.vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_us
er_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.632 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.634 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.635 187212 DEBUG nova.objects.instance [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.650 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <uuid>d2085dd9-2ebd-4804-99c1-3b15cbd216f8</uuid>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <name>instance-00000021</name>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-835443144</nova:name>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:12</nova:creationTime>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:user uuid="3ee170bdfdd343189ee1da01bdb80be6">tempest-ImagesOneServerNegativeTestJSON-661137252-project-member</nova:user>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:project uuid="79895287bd1d488c842f6013729a1f81">tempest-ImagesOneServerNegativeTestJSON-661137252</nova:project>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        <nova:port uuid="0a11e563-2be9-4ce9-af51-7d29b586e233">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="serial">d2085dd9-2ebd-4804-99c1-3b15cbd216f8</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="uuid">d2085dd9-2ebd-4804-99c1-3b15cbd216f8</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:f2:70:f2"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <target dev="tap0a11e563-2b"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/console.log" append="off"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:12 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:12 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:12 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:12 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.651 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Preparing to wait for external event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.651 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.652 187212 DEBUG nova.virt.libvirt.vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.653 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.653 187212 DEBUG nova.network.os_vif_util [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.654 187212 DEBUG os_vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.655 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.655 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.657 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.657 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a11e563-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.658 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0a11e563-2b, col_values=(('external_ids', {'iface-id': '0a11e563-2be9-4ce9-af51-7d29b586e233', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:70:f2', 'vm-uuid': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.659 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 NetworkManager[55691]: <info>  [1764936192.6607] manager: (tap0a11e563-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.669 187212 INFO os_vif [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b')#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.717 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.717 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.718 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No VIF found with MAC fa:16:3e:f2:70:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.718 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Using config drive#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.720 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.721 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.721 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.722 187212 INFO nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Terminating instance#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.723 187212 DEBUG nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.727 187212 INFO nova.virt.libvirt.driver [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Instance destroyed successfully.#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.728 187212 DEBUG nova.objects.instance [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid 9efa988a-19ae-440a-8a56-0bac68cb3c9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.746 187212 DEBUG nova.virt.libvirt.vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1705515982',display_name='tempest-ImagesTestJSON-server-1705515982',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1705515982',id=28,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-9ph9qh0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=9efa988a-19ae-440a-8a56-0bac68cb3c9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.746 187212 DEBUG nova.network.os_vif_util [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "address": "fa:16:3e:53:25:56", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82089bf4-20", "ovs_interfaceid": "82089bf4-207e-4880-b8ff-9bf09a4ac3fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.747 187212 DEBUG nova.network.os_vif_util [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.748 187212 DEBUG os_vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.750 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82089bf4-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.758 187212 INFO os_vif [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:25:56,bridge_name='br-int',has_traffic_filtering=True,id=82089bf4-207e-4880-b8ff-9bf09a4ac3fb,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82089bf4-20')#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.759 187212 INFO nova.virt.libvirt.driver [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deleting instance files /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e_del#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.767 187212 INFO nova.virt.libvirt.driver [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deletion of /var/lib/nova/instances/9efa988a-19ae-440a-8a56-0bac68cb3c9e_del complete#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG nova.compute.manager [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-changed-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG nova.compute.manager [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Refreshing instance network info cache due to event network-changed-0a11e563-2be9-4ce9-af51-7d29b586e233. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.811 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.812 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.812 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Refreshing network info cache for port 0a11e563-2be9-4ce9-af51-7d29b586e233 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.833 187212 INFO nova.compute.manager [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 0.11 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.834 187212 DEBUG oslo.service.loopingcall [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.835 187212 DEBUG nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:12 np0005546909 nova_compute[187208]: 2025-12-05 12:03:12.835 187212 DEBUG nova.network.neutron [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.701 187212 INFO nova.virt.libvirt.driver [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot image upload complete#033[00m
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.702 187212 INFO nova.compute.manager [None req-53fb7efe-5185-42ae-9e88-4bcbc057cc44 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 4.38 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.737 187212 INFO nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Creating config drive at /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config#033[00m
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.745 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42r5frbt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.873 187212 DEBUG oslo_concurrency.processutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42r5frbt" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:13 np0005546909 kernel: tap0a11e563-2b: entered promiscuous mode
Dec  5 07:03:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:13Z|00251|binding|INFO|Claiming lport 0a11e563-2be9-4ce9-af51-7d29b586e233 for this chassis.
Dec  5 07:03:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:13Z|00252|binding|INFO|0a11e563-2be9-4ce9-af51-7d29b586e233: Claiming fa:16:3e:f2:70:f2 10.100.0.12
Dec  5 07:03:13 np0005546909 NetworkManager[55691]: <info>  [1764936193.9327] manager: (tap0a11e563-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.947 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:f2 10.100.0.12'], port_security=['fa:16:3e:f2:70:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0a11e563-2be9-4ce9-af51-7d29b586e233) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.949 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0a11e563-2be9-4ce9-af51-7d29b586e233 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 bound to our chassis#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.952 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d064000-316c-46a7-a23c-1dc26318b6a4#033[00m
Dec  5 07:03:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:13Z|00253|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 ovn-installed in OVS
Dec  5 07:03:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:13Z|00254|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 up in Southbound
Dec  5 07:03:13 np0005546909 nova_compute[187208]: 2025-12-05 12:03:13.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.962 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[859e41c1-1655-4466-9f76-b3c3e6cc9db6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.963 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d064000-31 in ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:13 np0005546909 systemd-udevd[220031]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.965 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d064000-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.965 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f19c3456-297b-4d2f-b2fa-8918e3f4bd53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.966 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5890975b-423a-4312-a132-1a4b595fd80d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:13 np0005546909 NetworkManager[55691]: <info>  [1764936193.9782] device (tap0a11e563-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:13 np0005546909 systemd-machined[153543]: New machine qemu-37-instance-00000021.
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.977 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6227a224-997f-4b35-805b-308ac445ce31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:13 np0005546909 NetworkManager[55691]: <info>  [1764936193.9796] device (tap0a11e563-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:13 np0005546909 systemd[1]: Started Virtual Machine qemu-37-instance-00000021.
Dec  5 07:03:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:13.990 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd6eb71-c305-4c85-84df-242ab645d3bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.011 187212 DEBUG nova.network.neutron [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.030 187212 INFO nova.compute.manager [-] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Took 1.19 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.031 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e38b47eb-0741-462d-886e-a7f8f82a8eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63e4fcde-19d0-40e1-8fa2-b29f4727d8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 NetworkManager[55691]: <info>  [1764936194.0375] manager: (tap5d064000-30): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Dec  5 07:03:14 np0005546909 systemd-udevd[220035]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.042 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.043 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[209c5e36-290c-4bf8-bb21-e1f04c3d0ed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.069 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e643bd-eae0-415c-885e-1c83210dcf3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.073 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:03:14 np0005546909 NetworkManager[55691]: <info>  [1764936194.0890] device (tap5d064000-30): carrier: link connected
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.093 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1f0107-b1de-484d-adab-5055f8f21145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.094 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.095 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aedca57e-910e-40d2-a8c6-dac2c9cb5381]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357465, 'reachable_time': 41440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220064, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.123 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2efd67a-70d1-4a9a-a9ee-74323826e893]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:6d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 357465, 'tstamp': 357465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220065, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1f461bd5-14ac-414b-b137-8bde8c36aa1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357465, 'reachable_time': 41440, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220066, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.143 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.172 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ac3cfe-6204-4804-8e31-d086257e115b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.231 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4529b8-f846-4aa3-ab3b-ee1d6b40f967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.233 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.234 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d064000-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:14 np0005546909 NetworkManager[55691]: <info>  [1764936194.2948] manager: (tap5d064000-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Dec  5 07:03:14 np0005546909 kernel: tap5d064000-30: entered promiscuous mode
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.296 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.299 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d064000-30, col_values=(('external_ids', {'iface-id': '1b49f23e-d835-4ef5-82b9-a339d97fd4cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:14Z|00255|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.302 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.302 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.303 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1779df-9682-4cce-99d8-c0b2fdd150d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.304 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-5d064000-316c-46a7-a23c-1dc26318b6a4
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:14.305 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'env', 'PROCESS_TAG=haproxy-5d064000-316c-46a7-a23c-1dc26318b6a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d064000-316c-46a7-a23c-1dc26318b6a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.332 187212 DEBUG nova.compute.provider_tree [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.352 187212 DEBUG nova.scheduler.client.report [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.374 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.376 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.382 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.382 187212 INFO nova.compute.claims [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.420 187212 INFO nova.scheduler.client.report [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance 9efa988a-19ae-440a-8a56-0bac68cb3c9e#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.490 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936194.4897525, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.490 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.523 187212 DEBUG oslo_concurrency.lockutils [None req-34f70cb5-1f00-4cc6-9647-99ae63e555c8 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "9efa988a-19ae-440a-8a56-0bac68cb3c9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.526 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.536 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936194.4922235, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.537 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.560 187212 DEBUG nova.compute.provider_tree [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.571 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.587 187212 DEBUG nova.scheduler.client.report [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.613 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.617 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.617 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.675 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.676 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:03:14 np0005546909 podman[220103]: 2025-12-05 12:03:14.690842485 +0000 UTC m=+0.049191376 container create 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.694 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.718 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:03:14 np0005546909 systemd[1]: Started libpod-conmon-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope.
Dec  5 07:03:14 np0005546909 podman[220103]: 2025-12-05 12:03:14.664343668 +0000 UTC m=+0.022692589 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:14 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:14 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f72b34854b6787c4cd40b26e6f22d36e2f45382e4691a696a9fc490f51c1bb73/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:14 np0005546909 podman[220103]: 2025-12-05 12:03:14.791802239 +0000 UTC m=+0.150151150 container init 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:14 np0005546909 podman[220103]: 2025-12-05 12:03:14.79848132 +0000 UTC m=+0.156830211 container start 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.803 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.804 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating image(s)#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.805 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.806 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.820 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:14 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : New worker (220125) forked
Dec  5 07:03:14 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : Loading success.
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.895 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.896 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.897 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.910 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.930 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updated VIF entry in instance network info cache for port 0a11e563-2be9-4ce9-af51-7d29b586e233. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.932 187212 DEBUG nova.network.neutron [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [{"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.957 187212 DEBUG nova.policy [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.968 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:14 np0005546909 nova_compute[187208]: 2025-12-05 12:03:14.969 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.004 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.006 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.007 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.074 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.074 187212 DEBUG nova.virt.disk.api [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.075 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.095 187212 DEBUG oslo_concurrency.lockutils [req-8cd9fc31-f663-41ef-a794-68dc9a28192c req-14f6e227-23f4-491c-ae08-e2b4599b4dd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-d2085dd9-2ebd-4804-99c1-3b15cbd216f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.141 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.142 187212 DEBUG nova.virt.disk.api [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.142 187212 DEBUG nova.objects.instance [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.156 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.157 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Ensure instance console log exists: /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.157 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.158 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.158 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:15 np0005546909 nova_compute[187208]: 2025-12-05 12:03:15.945 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Successfully created port: 7022c257-2ab5-436e-9757-387e9de66b18 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.827 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.829 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.829 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.830 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.830 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.832 187212 INFO nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Terminating instance#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.833 187212 DEBUG nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.895 187212 DEBUG nova.compute.manager [req-c3ce9f82-67ca-4282-9ba5-1d1f1bc8894b req-dc37c75e-b531-4262-bcbe-ec6037bcfd7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 9efa988a-19ae-440a-8a56-0bac68cb3c9e] Received event network-vif-deleted-82089bf4-207e-4880-b8ff-9bf09a4ac3fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:16 np0005546909 kernel: tapb5ee44c8-34 (unregistering): left promiscuous mode
Dec  5 07:03:16 np0005546909 NetworkManager[55691]: <info>  [1764936196.9297] device (tapb5ee44c8-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:16Z|00256|binding|INFO|Releasing lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff from this chassis (sb_readonly=0)
Dec  5 07:03:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:16Z|00257|binding|INFO|Setting lport b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff down in Southbound
Dec  5 07:03:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:16Z|00258|binding|INFO|Removing iface tapb5ee44c8-34 ovn-installed in OVS
Dec  5 07:03:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.952 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:33:fe 10.100.0.9'], port_security=['fa:16:3e:2e:33:fe 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '342e6d694cf6482c9f1b7557a17bce60', 'neutron:revision_number': '4', 'neutron:security_group_ids': '710ea28e-d1ba-4c63-a751-16b460b2129b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a85cd729-c72e-4d3c-b444-ff0b42d436ff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.954 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff in datapath 393d33f9-2dde-4fb5-b5db-3f0fb98d4637 unbound from our chassis#033[00m
Dec  5 07:03:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.956 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d8e3679b-44e1-4437-a056-42e4369dff79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:16.958 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 namespace which is not needed anymore#033[00m
Dec  5 07:03:16 np0005546909 nova_compute[187208]: 2025-12-05 12:03:16.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:16 np0005546909 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec  5 07:03:16 np0005546909 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d0000001d.scope: Consumed 13.206s CPU time.
Dec  5 07:03:16 np0005546909 systemd-machined[153543]: Machine qemu-33-instance-0000001d terminated.
Dec  5 07:03:17 np0005546909 podman[220167]: 2025-12-05 12:03:17.026727235 +0000 UTC m=+0.070598557 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.090 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.091 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.092 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [NOTICE]   (218972) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : Exiting Master process...
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : Exiting Master process...
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [ALERT]    (218972) : Current worker (218974) exited with code 143 (Terminated)
Dec  5 07:03:17 np0005546909 neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637[218966]: [WARNING]  (218972) : All workers exited. Exiting... (0)
Dec  5 07:03:17 np0005546909 systemd[1]: libpod-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope: Deactivated successfully.
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.108 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:03:17 np0005546909 podman[220210]: 2025-12-05 12:03:17.114281506 +0000 UTC m=+0.047715594 container died 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.121 187212 INFO nova.virt.libvirt.driver [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Instance destroyed successfully.#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.121 187212 DEBUG nova.objects.instance [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lazy-loading 'resources' on Instance uuid bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.145 187212 DEBUG nova.virt.libvirt.vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-479694898',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(25),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-479694898',id=29,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=25,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHclZt3lDeuFOP8poKE+ML8+DG1Fbw3aUsTnjf0HLJVz5RSbJGx4tv2GGPcCJx4ta3mNRAE5Oj+av9qQ6qgWWoPyu4x9SJdJ+NWU4lkfCG3kIVf4et9X/7mGn0JPIZgI2A==',key_name='tempest-keypair-270659961',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:02:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='342e6d694cf6482c9f1b7557a17bce60',ramdisk_id='',reservation_id='r-70canao4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-1976479976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:02:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79758a6c7516459bb1907270241d266a',uuid=bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.147 187212 DEBUG nova.network.os_vif_util [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converting VIF {"id": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "address": "fa:16:3e:2e:33:fe", "network": {"id": "393d33f9-2dde-4fb5-b5db-3f0fb98d4637", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1250645992-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "342e6d694cf6482c9f1b7557a17bce60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5ee44c8-34", "ovs_interfaceid": "b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.148 187212 DEBUG nova.network.os_vif_util [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.148 187212 DEBUG os_vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5ee44c8-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.155 187212 INFO os_vif [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:33:fe,bridge_name='br-int',has_traffic_filtering=True,id=b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff,network=Network(393d33f9-2dde-4fb5-b5db-3f0fb98d4637),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5ee44c8-34')#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.156 187212 INFO nova.virt.libvirt.driver [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deleting instance files /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77_del#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.157 187212 INFO nova.virt.libvirt.driver [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deletion of /var/lib/nova/instances/bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77_del complete#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.181 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.182 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.189 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.190 187212 INFO nova.compute.claims [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:03:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay-61c2e9c401d8673a2e5697d0fcec7ebf4e56c09bb215df29178f1e818f6cd815-merged.mount: Deactivated successfully.
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.241 187212 INFO nova.compute.manager [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG oslo.service.loopingcall [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.242 187212 DEBUG nova.network.neutron [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:17 np0005546909 podman[220210]: 2025-12-05 12:03:17.248440838 +0000 UTC m=+0.181874936 container cleanup 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:03:17 np0005546909 systemd[1]: libpod-conmon-1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5.scope: Deactivated successfully.
Dec  5 07:03:17 np0005546909 podman[220261]: 2025-12-05 12:03:17.308508554 +0000 UTC m=+0.038340146 container remove 1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.312 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Successfully updated port: 7022c257-2ab5-436e-9757-387e9de66b18 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[187c5310-6073-4d07-9479-5a6d2bc97470]: (4, ('Fri Dec  5 12:03:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5)\n1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5\nFri Dec  5 12:03:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 (1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5)\n1a1c5f63e9a06d7a9f8bc2a4faa4468649247cd6631be1ebee1d000bf13f01c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.316 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9713152f-80dd-4f2c-a8b1-ab53510e2012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.317 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap393d33f9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:17 np0005546909 kernel: tap393d33f9-20: left promiscuous mode
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.330 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.330 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.331 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0143b9aa-6cb0-439e-8519-5d66b15e2844]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0652fc-58cc-4e40-ae61-53d8b6920aef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1484b01c-839c-4b03-b929-f4959e58ce14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.372 187212 DEBUG nova.compute.provider_tree [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.384 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91d65e43-153a-4d15-88a9-1c9d414693d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354680, 'reachable_time': 37899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220276, 'error': None, 'target': 'ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 systemd[1]: run-netns-ovnmeta\x2d393d33f9\x2d2dde\x2d4fb5\x2db5db\x2d3f0fb98d4637.mount: Deactivated successfully.
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.388 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-393d33f9-2dde-4fb5-b5db-3f0fb98d4637 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:17.388 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2aecd1-d47e-4fa2-8bda-03099d94f26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.389 187212 DEBUG nova.scheduler.client.report [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.416 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.417 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.459 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.459 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.481 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.498 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:17Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:8a:0c 10.100.0.12
Dec  5 07:03:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:17Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:8a:0c 10.100.0.12
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.588 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.589 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating image(s)
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.590 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.591 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.607 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.674 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.675 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.676 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.688 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.716 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.748 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.749 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.770 187212 DEBUG nova.policy [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18569d5748e8448fbd1bcbf5d37ff5f6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70409a2f9710408cb377a61250853fbd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.785 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.785 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.786 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.860 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.861 187212 DEBUG nova.virt.disk.api [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Checking if we can resize image /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.861 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.919 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.921 187212 DEBUG nova.virt.disk.api [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Cannot resize image /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.922 187212 DEBUG nova.objects.instance [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'migration_context' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.939 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Ensure instance console log exists: /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.940 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:17 np0005546909 nova_compute[187208]: 2025-12-05 12:03:17.941 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.888 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.888 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.908 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.985 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.985 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.986 187212 DEBUG nova.network.neutron [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.996 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:03:18 np0005546909 nova_compute[187208]: 2025-12-05 12:03:18.996 187212 INFO nova.compute.claims [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.010 187212 INFO nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Took 1.77 seconds to deallocate network for instance.
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.364 187212 DEBUG nova.compute.provider_tree [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.383 187212 DEBUG nova.scheduler.client.report [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.411 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.412 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.414 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.506 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.507 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.534 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.558 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Successfully created port: a35b6b13-07bc-4c91-aaf5-231163a6ea44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.574 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.668 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.685 187212 DEBUG nova.compute.provider_tree [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.701 187212 DEBUG nova.scheduler.client.report [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.711 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.712 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance network_info: |[{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.717 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start _get_guest_xml network_info=[{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.722 187212 WARNING nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.729 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.735 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.737 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.738 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating image(s)
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.739 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.740 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.741 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.762 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.764 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.766 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.794 187212 INFO nova.scheduler.client.report [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Deleted allocations for instance bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.801 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.802 187212 DEBUG nova.virt.libvirt.host [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.802 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.803 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.804 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.805 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.805 187212 DEBUG nova.virt.hardware [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.809 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSO
N-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:14Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.810 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.811 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.812 187212 DEBUG nova.objects.instance [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.832 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <uuid>478fa005-452c-4e37-a919-63bb734a3c5c</uuid>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <name>instance-00000022</name>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersTestJSON-server-1018127156</nova:name>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:19</nova:creationTime>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        <nova:port uuid="7022c257-2ab5-436e-9757-387e9de66b18">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="serial">478fa005-452c-4e37-a919-63bb734a3c5c</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="uuid">478fa005-452c-4e37-a919-63bb734a3c5c</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:57:8e:0e"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <target dev="tap7022c257-2a"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/console.log" append="off"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:19 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:19 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:19 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:19 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.832 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Preparing to wait for external event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.833 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.834 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServ
ersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:14Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.834 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.835 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.835 187212 DEBUG os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.838 187212 DEBUG nova.policy [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.840 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.841 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7022c257-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7022c257-2a, col_values=(('external_ids', {'iface-id': '7022c257-2ab5-436e-9757-387e9de66b18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:8e:0e', 'vm-uuid': '478fa005-452c-4e37-a919-63bb734a3c5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:19 np0005546909 NetworkManager[55691]: <info>  [1764936199.8488] manager: (tap7022c257-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.853 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.854 187212 INFO os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a')
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.869 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.869 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.870 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.882 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.907 187212 DEBUG oslo_concurrency.lockutils [None req-ff74afd5-33cd-40fd-8296-35e1b1586b01 79758a6c7516459bb1907270241d266a 342e6d694cf6482c9f1b7557a17bce60 - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.946 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:19 np0005546909 nova_compute[187208]: 2025-12-05 12:03:19.947 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.081 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 DEBUG oslo_concurrency.lockutils [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 DEBUG nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.082 187212 WARNING nova.compute.manager [req-9fde6424-556b-40b5-833a-0a4772193d11 req-7df42078-30e0-495b-9094-a91bdf67964d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-unplugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state deleted and task_state None.
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.123 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.124 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.124 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:57:8e:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.125 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Using config drive
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.327 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936185.3261204, 2f42f732-65c6-4c4a-9332-47098d7350b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.327 187212 INFO nova.compute.manager [-] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] VM Stopped (Lifecycle Event)
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.358 187212 DEBUG nova.compute.manager [None req-6288a850-328f-41de-bbcd-77e52ec5cbf5 - - - - - -] [instance: 2f42f732-65c6-4c4a-9332-47098d7350b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.393 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk 1073741824" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.393 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.394 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.461 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.462 187212 DEBUG nova.virt.disk.api [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.463 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.523 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.524 187212 DEBUG nova.virt.disk.api [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.524 187212 DEBUG nova.objects.instance [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.541 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.542 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Ensure instance console log exists: /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.542 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.543 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.543 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.871 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Successfully created port: ecec1a41-6f3e-4852-8cdb-9d461eded987 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.905 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Successfully updated port: a35b6b13-07bc-4c91-aaf5-231163a6ea44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquired lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:03:20 np0005546909 nova_compute[187208]: 2025-12-05 12:03:20.928 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.111 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.522 187212 INFO nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Creating config drive at /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.529 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3w8vgx5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.655 187212 DEBUG oslo_concurrency.processutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3w8vgx5" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:21 np0005546909 kernel: tap7022c257-2a: entered promiscuous mode
Dec  5 07:03:21 np0005546909 NetworkManager[55691]: <info>  [1764936201.7208] manager: (tap7022c257-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Dec  5 07:03:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:21Z|00259|binding|INFO|Claiming lport 7022c257-2ab5-436e-9757-387e9de66b18 for this chassis.
Dec  5 07:03:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:21Z|00260|binding|INFO|7022c257-2ab5-436e-9757-387e9de66b18: Claiming fa:16:3e:57:8e:0e 10.100.0.4
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:21Z|00261|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 ovn-installed in OVS
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.739 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:21 np0005546909 systemd-udevd[220325]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:21 np0005546909 systemd-machined[153543]: New machine qemu-38-instance-00000022.
Dec  5 07:03:21 np0005546909 NetworkManager[55691]: <info>  [1764936201.7621] device (tap7022c257-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:21 np0005546909 NetworkManager[55691]: <info>  [1764936201.7631] device (tap7022c257-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:21 np0005546909 systemd[1]: Started Virtual Machine qemu-38-instance-00000022.
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.841 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:8e:0e 10.100.0.4'], port_security=['fa:16:3e:57:8e:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '478fa005-452c-4e37-a919-63bb734a3c5c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7022c257-2ab5-436e-9757-387e9de66b18) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  5 07:03:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:21Z|00262|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 up in Southbound
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.842 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7022c257-2ab5-436e-9757-387e9de66b18 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.845 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c85c3c9-7167-4be7-ad85-ef605ad4b96b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.862 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.864 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.865 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c96b232c-0a71-4984-be48-835fe33dbe94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.865 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4bfa64d7-7e1c-434f-aa07-2a0a49d1f2ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.879 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c6798d97-86b4-41fb-b94f-3da1b1610024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.905 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e13e184a-a6b9-48f2-9b1d-febc5ee87c6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.937 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1b80c5-0424-412f-a988-157b9994151f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed84fbf4-1cb2-4651-bbc8-ab73046f622f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  5 07:03:21 np0005546909 NetworkManager[55691]: <info>  [1764936201.9436] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.977 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Successfully updated port: ecec1a41-6f3e-4852-8cdb-9d461eded987 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.978 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[74e1e546-f343-44cc-88ad-9c5320f70140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:21 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:21.981 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d65b9ed-a376-4b5f-8383-25740cdb586e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:21 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.999 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:21.999 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.000 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:22 np0005546909 NetworkManager[55691]: <info>  [1764936202.0042] device (tapd7360f84-b0): carrier: link connected
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.010 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[804e9388-f52b-48c7-ba1c-a5cae8ca0ffa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b6f9d6a-ba23-43b5-813f-6c4cbb49f98f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358256, 'reachable_time': 37299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220367, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.040 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e49f2d4-eef3-40f3-b2c4-26ca087e3b1e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358256, 'tstamp': 358256}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220368, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.058 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4883f56e-a4ef-46f9-8fdc-f56d4ebf01c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358256, 'reachable_time': 37299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220369, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[99aa8a0d-392c-404f-a9e8-4776c5b70758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.138 187212 DEBUG nova.network.neutron [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.139 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f68db8d4-4007-4aa9-90c8-a8a26ba98a8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.141 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.141 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.142 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.145 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 NetworkManager[55691]: <info>  [1764936202.1464] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Dec  5 07:03:22 np0005546909 kernel: tapd7360f84-b0: entered promiscuous mode
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.148 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:22Z|00263|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.152 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6f535c-75a9-4d92-b84e-485dcaff9264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.154 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:22.156 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936202.1635714, 478fa005-452c-4e37-a919-63bb734a3c5c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.164 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.166 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Releasing lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.166 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance network_info: |[{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.168 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start _get_guest_xml network_info=[{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.171 187212 WARNING nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.177 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.177 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.180 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.180 187212 DEBUG nova.virt.libvirt.host [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.181 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.181 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.182 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.183 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.184 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.184 187212 DEBUG nova.virt.hardware [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.188 187212 DEBUG nova.virt.libvirt.vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1806311246',owner_u
ser_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.189 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.190 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.191 187212 DEBUG nova.objects.instance [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'pci_devices' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.194 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.197 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.202 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936202.1636398, 478fa005-452c-4e37-a919-63bb734a3c5c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.202 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.217 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <uuid>05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</uuid>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <name>instance-00000023</name>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:name>tempest-InstanceActionsNegativeTestJSON-server-378209780</nova:name>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:22</nova:creationTime>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:user uuid="18569d5748e8448fbd1bcbf5d37ff5f6">tempest-InstanceActionsNegativeTestJSON-1806311246-project-member</nova:user>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:project uuid="70409a2f9710408cb377a61250853fbd">tempest-InstanceActionsNegativeTestJSON-1806311246</nova:project>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        <nova:port uuid="a35b6b13-07bc-4c91-aaf5-231163a6ea44">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="serial">05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="uuid">05008cd8-8cac-482b-9ff8-68f2f0aaa6d4</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:91:a5:f2"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <target dev="tapa35b6b13-07"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/console.log" append="off"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:22 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:22 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:22 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:22 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.218 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Preparing to wait for external event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.218 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.219 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.219 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.220 187212 DEBUG nova.virt.libvirt.vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsNegativeTestJSON-180631124
6',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.220 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.221 187212 DEBUG nova.network.os_vif_util [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.221 187212 DEBUG os_vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.223 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.224 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.233 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa35b6b13-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.234 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa35b6b13-07, col_values=(('external_ids', {'iface-id': 'a35b6b13-07bc-4c91-aaf5-231163a6ea44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:a5:f2', 'vm-uuid': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:22 np0005546909 NetworkManager[55691]: <info>  [1764936202.2372] manager: (tapa35b6b13-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.240 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.245 187212 INFO os_vif [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07')#033[00m
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.271 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Dec  5 07:03:22 np0005546909 podman[220382]: 2025-12-05 12:03:22.341674387 +0000 UTC m=+0.057192474 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, 
io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec  5 07:03:22 np0005546909 podman[220383]: 2025-12-05 12:03:22.379113117 +0000 UTC m=+0.089544559 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  5 07:03:22 np0005546909 nova_compute[187208]: 2025-12-05 12:03:22.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:22 np0005546909 podman[220444]: 2025-12-05 12:03:22.498721813 +0000 UTC m=+0.026285852 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.046 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.048 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.049 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Processing event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.050 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.050 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.051 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 WARNING nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received unexpected event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-changed-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.052 187212 DEBUG nova.compute.manager [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Refreshing instance network info cache due to event network-changed-7022c257-2ab5-436e-9757-387e9de66b18. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.053 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Refreshing network info cache for port 7022c257-2ab5-436e-9757-387e9de66b18 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.055 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.060 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936203.0599492, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.060 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.062 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.066 187212 INFO nova.virt.libvirt.driver [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance spawned successfully.#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.067 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.084 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.088 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.092 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.092 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.093 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.093 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.094 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.094 187212 DEBUG nova.virt.libvirt.driver [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.113 187212 DEBUG nova.network.neutron [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.124 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.146 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.146 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance network_info: |[{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.149 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start _get_guest_xml network_info=[{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.154 187212 WARNING nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.157 187212 INFO nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 17.53 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.157 187212 DEBUG nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.164 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.165 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.168 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.libvirt.host [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.169 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.170 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.171 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.172 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.172 187212 DEBUG nova.virt.hardware [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.176 187212 DEBUG nova.virt.libvirt.vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:19Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.176 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.177 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.178 187212 DEBUG nova.objects.instance [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.342 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <uuid>c1e2f189-1777-4f28-97ab-72cf0f60fbc0</uuid>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <name>instance-00000024</name>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesTestJSON-server-1049650520</nova:name>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:23</nova:creationTime>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        <nova:port uuid="ecec1a41-6f3e-4852-8cdb-9d461eded987">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="serial">c1e2f189-1777-4f28-97ab-72cf0f60fbc0</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="uuid">c1e2f189-1777-4f28-97ab-72cf0f60fbc0</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:57:88:7f"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <target dev="tapecec1a41-6f"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/console.log" append="off"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:23 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:23 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:23 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:23 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.343 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Preparing to wait for external event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.343 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.344 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.344 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.345 187212 DEBUG nova.virt.libvirt.vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:19Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.345 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.346 187212 DEBUG nova.network.os_vif_util [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.346 187212 DEBUG os_vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.347 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.350 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.350 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapecec1a41-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.351 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapecec1a41-6f, col_values=(('external_ids', {'iface-id': 'ecec1a41-6f3e-4852-8cdb-9d461eded987', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:57:88:7f', 'vm-uuid': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.352 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:23 np0005546909 NetworkManager[55691]: <info>  [1764936203.3539] manager: (tapecec1a41-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.354 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.364 187212 INFO os_vif [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f')#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.534 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.535 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.535 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] No VIF found with MAC fa:16:3e:91:a5:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.536 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Using config drive#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.546 187212 INFO nova.compute.manager [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 18.41 seconds to build instance.#033[00m
Dec  5 07:03:23 np0005546909 podman[220444]: 2025-12-05 12:03:23.799870387 +0000 UTC m=+1.327434426 container create 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.816 187212 DEBUG oslo_concurrency.lockutils [None req-c289406f-7dc2-4fc6-a5b6-3396e11127d5 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.823 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.824 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.824 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:57:88:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:23 np0005546909 nova_compute[187208]: 2025-12-05 12:03:23.825 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Using config drive#033[00m
Dec  5 07:03:24 np0005546909 systemd[1]: Started libpod-conmon-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope.
Dec  5 07:03:24 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:24 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ded9da6901c69416b75c147b588f572cf065df3fa0a5ca976e39b2a29f8769f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:24 np0005546909 podman[220444]: 2025-12-05 12:03:24.230407185 +0000 UTC m=+1.757971234 container init 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:24 np0005546909 podman[220444]: 2025-12-05 12:03:24.237105396 +0000 UTC m=+1.764669405 container start 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:03:24 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : New worker (220469) forked
Dec  5 07:03:24 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : Loading success.
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.422 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.423 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.423 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.424 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.424 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] No waiting events found dispatching network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.425 187212 WARNING nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received unexpected event network-vif-plugged-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.425 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Received event network-vif-deleted-b5ee44c8-3435-44a9-87e0-c63ad3d4c3ff external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.426 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-changed-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.426 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Refreshing instance network info cache due to event network-changed-a35b6b13-07bc-4c91-aaf5-231163a6ea44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.427 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.427 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.428 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Refreshing network info cache for port a35b6b13-07bc-4c91-aaf5-231163a6ea44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.549 187212 INFO nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Creating config drive at /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.554 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilfh4oqp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.591 187212 INFO nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Creating config drive at /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.598 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplqla7n8s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.704 187212 DEBUG oslo_concurrency.processutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpilfh4oqp" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.727 187212 DEBUG oslo_concurrency.processutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplqla7n8s" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:25 np0005546909 kernel: tapa35b6b13-07: entered promiscuous mode
Dec  5 07:03:25 np0005546909 kernel: tapecec1a41-6f: entered promiscuous mode
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8103] manager: (tapa35b6b13-07): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8127] manager: (tapecec1a41-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00264|binding|INFO|Claiming lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 for this chassis.
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00265|binding|INFO|a35b6b13-07bc-4c91-aaf5-231163a6ea44: Claiming fa:16:3e:91:a5:f2 10.100.0.10
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00266|binding|INFO|Claiming lport ecec1a41-6f3e-4852-8cdb-9d461eded987 for this chassis.
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00267|binding|INFO|ecec1a41-6f3e-4852-8cdb-9d461eded987: Claiming fa:16:3e:57:88:7f 10.100.0.5
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.832 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:88:7f 10.100.0.5'], port_security=['fa:16:3e:57:88:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecec1a41-6f3e-4852-8cdb-9d461eded987) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.834 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:a5:f2 10.100.0.10'], port_security=['fa:16:3e:91:a5:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5223579-477c-4fbe-a58c-2e56f428541c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70409a2f9710408cb377a61250853fbd', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cfbac9bb-a6fa-4e30-b1e0-c07877ef21de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85bd0838-1b85-4e8e-bf67-d21df8aa9251, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a35b6b13-07bc-4c91-aaf5-231163a6ea44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.836 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecec1a41-6f3e-4852-8cdb-9d461eded987 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.838 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:03:25 np0005546909 systemd-udevd[220508]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:25 np0005546909 systemd-udevd[220507]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00268|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 ovn-installed in OVS
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00269|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 up in Southbound
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00270|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 ovn-installed in OVS
Dec  5 07:03:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:25Z|00271|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 up in Southbound
Dec  5 07:03:25 np0005546909 nova_compute[187208]: 2025-12-05 12:03:25.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8570] device (tapa35b6b13-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.853 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9351a3c9-6c59-40fa-90a2-521b22e167c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.854 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8579] device (tapa35b6b13-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8584] device (tapecec1a41-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.8589] device (tapecec1a41-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.860 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.860 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35a871f8-e2fc-46ea-adc6-abf251ad251e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a6b21d-cd64-4104-b60d-41cb279b4af1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.877 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[44b7b886-d65d-454f-8888-5459cb6596b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 systemd-machined[153543]: New machine qemu-40-instance-00000024.
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.902 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f82b5f6-7be3-451f-8d25-7019cf5c9b66]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 systemd[1]: Started Virtual Machine qemu-40-instance-00000024.
Dec  5 07:03:25 np0005546909 systemd-machined[153543]: New machine qemu-39-instance-00000023.
Dec  5 07:03:25 np0005546909 systemd[1]: Started Virtual Machine qemu-39-instance-00000023.
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.931 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[547d3c23-693c-4dc8-8d48-28a30e046455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 NetworkManager[55691]: <info>  [1764936205.9386] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/119)
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a57b26b7-884e-4b12-b33c-4db2b75e67e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.978 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d682f75-0e00-4995-b155-37231bf6cd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:25.981 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d591a16-2366-4af4-b9ad-296fa95c2873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 NetworkManager[55691]: <info>  [1764936206.0049] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.010 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[922d7a46-9880-4190-90e4-f72185796573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f4cf4ddb-c994-4104-9820-0469f81c539f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358656, 'reachable_time': 39414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220553, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.039 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef11c15-c88f-4373-a93b-dd2342caf725]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358656, 'tstamp': 358656}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220554, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.053 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc738682-3f58-412e-8029-845fa210aff1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358656, 'reachable_time': 39414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220555, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.083 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d75b1964-8dff-4a7a-9194-ade635535dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e112b7-df07-443a-93f8-83830ec3468a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.155 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:26 np0005546909 NetworkManager[55691]: <info>  [1764936206.1579] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:26 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.163 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:26Z|00272|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.167 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.180 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eec68ef8-50f8-4101-ae66-75c469515f85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.183 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:26.185 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.211 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2104788, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.211 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.236 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.243 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2115746, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.243 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.265 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.288 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.289 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.287661, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.289 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.310 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.313 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936206.2877815, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.314 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.335 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:26 np0005546909 nova_compute[187208]: 2025-12-05 12:03:26.360 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:26 np0005546909 podman[220598]: 2025-12-05 12:03:26.567046288 +0000 UTC m=+0.028315900 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:26 np0005546909 podman[220598]: 2025-12-05 12:03:26.896298282 +0000 UTC m=+0.357567864 container create f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:26 np0005546909 systemd[1]: Started libpod-conmon-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope.
Dec  5 07:03:26 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d1e943a749671a1b77c4cdf6fa31b99e20bf9ab4dc014c673f02b47bd056cb36/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:27 np0005546909 podman[220598]: 2025-12-05 12:03:27.033079368 +0000 UTC m=+0.494348970 container init f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:03:27 np0005546909 podman[220598]: 2025-12-05 12:03:27.046223294 +0000 UTC m=+0.507492886 container start f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  5 07:03:27 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : New worker (220619) forked
Dec  5 07:03:27 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : Loading success.
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.144 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a35b6b13-07bc-4c91-aaf5-231163a6ea44 in datapath f5223579-477c-4fbe-a58c-2e56f428541c unbound from our chassis#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.146 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5223579-477c-4fbe-a58c-2e56f428541c#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.155 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff6879cb-e397-4590-b291-ba3154ae5f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.156 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5223579-41 in ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.158 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5223579-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.158 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[08c02eee-3fec-4255-a2a0-4121ecb22b9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.159 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aa68f7-6701-4b63-abea-2c25ef7bd19b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.171 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[09731575-037f-4513-b992-a404974e3418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.187 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f2f2fd3-0841-472b-9516-2f1bead1970b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.225 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d4443f-e937-4e91-99c6-e924c698ac5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2af2aae0-32c6-4723-a359-20f97e07dec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 NetworkManager[55691]: <info>  [1764936207.2342] manager: (tapf5223579-40): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.264 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[034b702b-91ea-4c5e-af34-e1558701446c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.266 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2fcad417-5231-4118-817e-badc1fb9b809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 NetworkManager[55691]: <info>  [1764936207.2869] device (tapf5223579-40): carrier: link connected
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.290 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3f6e47-0246-44a1-aa0b-b58c840a268a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.308 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c88e22e-b453-4486-a59d-2806c09b78fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5223579-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:8e:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358785, 'reachable_time': 37517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220638, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[538657de-66bc-417b-8228-2e7b666fd1a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe45:8e7d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358785, 'tstamp': 358785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220639, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8856616-704e-4b47-b961-ae01151f8661]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5223579-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:45:8e:7d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358785, 'reachable_time': 37517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220640, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[064b3a3b-6f50-4810-b04b-9a053d8d33c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.417 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4b321ccb-e8db-4589-a17f-a874dfad19ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.418 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5223579-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.419 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5223579-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:27 np0005546909 kernel: tapf5223579-40: entered promiscuous mode
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:27 np0005546909 NetworkManager[55691]: <info>  [1764936207.4533] manager: (tapf5223579-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.455 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5223579-40, col_values=(('external_ids', {'iface-id': '41d0fbd3-22b2-4ee9-8c84-9f176e5ee865'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:27Z|00273|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.472 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.473 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec826654-6804-4387-85ce-e05977c26caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.474 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-f5223579-477c-4fbe-a58c-2e56f428541c
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/f5223579-477c-4fbe-a58c-2e56f428541c.pid.haproxy
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID f5223579-477c-4fbe-a58c-2e56f428541c
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:27.475 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'env', 'PROCESS_TAG=haproxy-f5223579-477c-4fbe-a58c-2e56f428541c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5223579-477c-4fbe-a58c-2e56f428541c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.548 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.807 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updated VIF entry in instance network info cache for port 7022c257-2ab5-436e-9757-387e9de66b18. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.808 187212 DEBUG nova.network.neutron [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [{"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:27 np0005546909 nova_compute[187208]: 2025-12-05 12:03:27.834 187212 DEBUG oslo_concurrency.lockutils [req-37c505a5-4dca-4fbf-82f5-c79380d4f291 req-39590403-6158-45b1-a31d-587a31287cac 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:27 np0005546909 podman[220673]: 2025-12-05 12:03:27.832792161 +0000 UTC m=+0.023153043 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:27 np0005546909 podman[220673]: 2025-12-05 12:03:27.983871726 +0000 UTC m=+0.174232578 container create 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:03:28 np0005546909 systemd[1]: Started libpod-conmon-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope.
Dec  5 07:03:28 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:28 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78756f8a0c787d9c7273ba58bfae472f50372e3fa267a0cd0bfee40b23f73738/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:28 np0005546909 podman[220673]: 2025-12-05 12:03:28.153644155 +0000 UTC m=+0.344005017 container init 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.155 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.155 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:28 np0005546909 podman[220686]: 2025-12-05 12:03:28.156891998 +0000 UTC m=+0.136227952 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.156 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Processing event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.158 187212 DEBUG oslo_concurrency.lockutils [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.159 187212 DEBUG nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] No waiting events found dispatching network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.159 187212 WARNING nova.compute.manager [req-4923a5ca-3ad3-4c68-82ef-06498dc763bb req-8561cbbb-84ba-4948-916c-9634d35e5869 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received unexpected event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 for instance with vm_state building and task_state deleting.#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.160 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:28 np0005546909 podman[220673]: 2025-12-05 12:03:28.161736857 +0000 UTC m=+0.352097709 container start 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.164 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936208.1644201, 478fa005-452c-4e37-a919-63bb734a3c5c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.164 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.166 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.170 187212 INFO nova.virt.libvirt.driver [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance spawned successfully.#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.170 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : New worker (220740) forked
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : Loading success.
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.199 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.208 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: deleting, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.213 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.213 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.214 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.215 187212 DEBUG nova.virt.libvirt.driver [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.235 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.435 187212 INFO nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 13.63 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.436 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:28 np0005546909 podman[220687]: 2025-12-05 12:03:28.44158973 +0000 UTC m=+0.416213669 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.483 187212 DEBUG nova.compute.utils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Conflict updating instance 478fa005-452c-4e37-a919-63bb734a3c5c. Expected: {'task_state': ['spawning']}. Actual: {'task_state': 'deleting'} notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.484 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance disappeared during build. _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2483#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.484 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.485 187212 DEBUG nova.virt.libvirt.vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1018127156',display_name='tempest-DeleteServersTestJSON-server-1018127156',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1018127156',id=34,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=2025-12-05T12:03:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-oru1ft1x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state=None,terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:17Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=478fa005-452c-4e37-a919-63bb734a3c5c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.485 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "7022c257-2ab5-436e-9757-387e9de66b18", "address": "fa:16:3e:57:8e:0e", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7022c257-2a", "ovs_interfaceid": "7022c257-2ab5-436e-9757-387e9de66b18", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.486 187212 DEBUG nova.network.os_vif_util [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.486 187212 DEBUG os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.488 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.488 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7022c257-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.540 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:28 np0005546909 kernel: tap7022c257-2a: left promiscuous mode
Dec  5 07:03:28 np0005546909 NetworkManager[55691]: <info>  [1764936208.5414] device (tap7022c257-2a): state change: disconnected -> unmanaged (reason 'unmanaged-external-down', managed-type: 'external')
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.542 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:28 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:28Z|00274|binding|INFO|Releasing lport 7022c257-2ab5-436e-9757-387e9de66b18 from this chassis (sb_readonly=0)
Dec  5 07:03:28 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:28Z|00275|binding|INFO|Setting lport 7022c257-2ab5-436e-9757-387e9de66b18 down in Southbound
Dec  5 07:03:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.560 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:8e:0e 10.100.0.4'], port_security=['fa:16:3e:57:8e:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '478fa005-452c-4e37-a919-63bb734a3c5c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7022c257-2ab5-436e-9757-387e9de66b18) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.561 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7022c257-2ab5-436e-9757-387e9de66b18 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.563 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.565 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.566 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3822956a-f178-469e-964b-c878e09d3965]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:28.567 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.572 187212 INFO os_vif [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:57:8e:0e,bridge_name='br-int',has_traffic_filtering=True,id=7022c257-2ab5-436e-9757-387e9de66b18,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7022c257-2a')#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:28 np0005546909 nova_compute[187208]: 2025-12-05 12:03:28.573 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [NOTICE]   (220467) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [WARNING]  (220467) : Exiting Master process...
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [ALERT]    (220467) : Current worker (220469) exited with code 143 (Terminated)
Dec  5 07:03:28 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[220463]: [WARNING]  (220467) : All workers exited. Exiting... (0)
Dec  5 07:03:28 np0005546909 systemd[1]: libpod-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope: Deactivated successfully.
Dec  5 07:03:28 np0005546909 podman[220770]: 2025-12-05 12:03:28.751327037 +0000 UTC m=+0.100849131 container died 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:28 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:28 np0005546909 systemd[1]: var-lib-containers-storage-overlay-1ded9da6901c69416b75c147b588f572cf065df3fa0a5ca976e39b2a29f8769f-merged.mount: Deactivated successfully.
Dec  5 07:03:29 np0005546909 podman[220770]: 2025-12-05 12:03:29.095076996 +0000 UTC m=+0.444599080 container cleanup 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:03:29 np0005546909 systemd[1]: libpod-conmon-8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43.scope: Deactivated successfully.
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.215 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updated VIF entry in instance network info cache for port a35b6b13-07bc-4c91-aaf5-231163a6ea44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.216 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [{"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.239 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-changed-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG nova.compute.manager [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Refreshing instance network info cache due to event network-changed-ecec1a41-6f3e-4852-8cdb-9d461eded987. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.240 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.241 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Refreshing network info cache for port ecec1a41-6f3e-4852-8cdb-9d461eded987 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:29 np0005546909 podman[220798]: 2025-12-05 12:03:29.27623156 +0000 UTC m=+0.159511067 container remove 8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.281 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00583953-3966-435e-aa11-965cc7e7ee23]: (4, ('Fri Dec  5 12:03:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43)\n8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43\nFri Dec  5 12:03:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43)\n8347d51b90550a0a790d3e8371a49d1c61975c341c8501b220a5f88269c7fd43\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.283 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfbcd26-62e0-4d4f-be68-513e59792677]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.284 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:29 np0005546909 kernel: tapd7360f84-b0: left promiscuous mode
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e35841d1-cfa7-4bda-9d48-9e4fbf52c36b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.311 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[596c5fd6-66c5-4868-8e51-540b8b38aaf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.312 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4753a636-ad36-484c-afed-31cb4e008607]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 nova_compute[187208]: 2025-12-05 12:03:29.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c06cf931-eebe-4916-be15-15dfabc21fa8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358249, 'reachable_time': 28551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220811, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:29 np0005546909 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.337 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:29.337 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b6497517-2e20-438f-9314-121115552c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.263 187212 DEBUG nova.network.neutron [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.286 187212 INFO nova.compute.manager [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 1.71 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.429 187212 INFO nova.scheduler.client.report [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 478fa005-452c-4e37-a919-63bb734a3c5c#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-071e9840-e92c-415d-8fc7-9fc44ccdeeb9 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 13.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.430 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.431 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 INFO nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Terminating instance#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.432 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.433 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.612 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.963 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updated VIF entry in instance network info cache for port ecec1a41-6f3e-4852-8cdb-9d461eded987. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.964 187212 DEBUG nova.network.neutron [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [{"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:30 np0005546909 nova_compute[187208]: 2025-12-05 12:03:30.984 187212 DEBUG oslo_concurrency.lockutils [req-8876f70b-585a-4fe1-afe5-9136097756f8 req-5ac1f820-e0e3-41f3-8ad3-13577a5e2716 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c1e2f189-1777-4f28-97ab-72cf0f60fbc0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.046 187212 DEBUG nova.network.neutron [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.061 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-478fa005-452c-4e37-a919-63bb734a3c5c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.062 187212 DEBUG nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:31 np0005546909 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000022.scope: Deactivated successfully.
Dec  5 07:03:31 np0005546909 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000022.scope: Consumed 3.165s CPU time.
Dec  5 07:03:31 np0005546909 systemd-machined[153543]: Machine qemu-38-instance-00000022 terminated.
Dec  5 07:03:31 np0005546909 podman[220816]: 2025-12-05 12:03:31.232005043 +0000 UTC m=+0.083820915 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.338 187212 INFO nova.virt.libvirt.driver [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance destroyed successfully.#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.339 187212 DEBUG nova.objects.instance [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 478fa005-452c-4e37-a919-63bb734a3c5c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.359 187212 INFO nova.virt.libvirt.driver [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deleting instance files /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c_del#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.360 187212 INFO nova.virt.libvirt.driver [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deletion of /var/lib/nova/instances/478fa005-452c-4e37-a919-63bb734a3c5c_del complete#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.416 187212 INFO nova.compute.manager [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.417 187212 DEBUG oslo.service.loopingcall [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.417 187212 DEBUG nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.418 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.757 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.772 187212 DEBUG nova.network.neutron [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.787 187212 INFO nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Took 0.37 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.834 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:31 np0005546909 nova_compute[187208]: 2025-12-05 12:03:31.836 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.039 187212 DEBUG nova.compute.provider_tree [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.055 187212 DEBUG nova.scheduler.client.report [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.078 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.119 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936197.1187773, bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.120 187212 INFO nova.compute.manager [-] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.145 187212 DEBUG nova.compute.manager [None req-d0df733a-df5c-4792-bc1b-e2b262df4072 - - - - - -] [instance: bf9b57b1-1e38-43fd-9a0a-9dca1d0f5b77] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.173 187212 DEBUG oslo_concurrency.lockutils [None req-258a947e-501d-4d91-8cd7-7ba4775ed6cd ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:32 np0005546909 nova_compute[187208]: 2025-12-05 12:03:32.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.095 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.095 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Processing event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.096 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG oslo_concurrency.lockutils [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.097 187212 DEBUG nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.098 187212 WARNING nova.compute.manager [req-ddb71c7e-1d8c-41c3-8591-d3ea8902be74 req-c3612be9-593a-4bb5-bfbe-d1ba528af1b4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received unexpected event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.099 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.100 187212 DEBUG nova.compute.manager [req-ce595d2d-9bc8-4b24-8274-0c4d85d58346 req-4f772a87-d60d-49f6-8f96-ca0a0cec236c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-deleted-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.103 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936213.103502, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.104 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.105 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.112 187212 INFO nova.virt.libvirt.driver [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance spawned successfully.#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.112 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.160 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.161 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.161 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.162 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.162 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.163 187212 DEBUG nova.virt.libvirt.driver [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.168 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.172 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.372 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.437 187212 INFO nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 15.85 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.438 187212 DEBUG nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.507 187212 INFO nova.compute.manager [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 16.34 seconds to build instance.#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.532 187212 DEBUG oslo_concurrency.lockutils [None req-310aeecc-dbc0-458a-b0d2-1a91aed1ccd1 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.540 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:33 np0005546909 nova_compute[187208]: 2025-12-05 12:03:33.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:34 np0005546909 nova_compute[187208]: 2025-12-05 12:03:34.654 187212 DEBUG nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:34 np0005546909 nova_compute[187208]: 2025-12-05 12:03:34.724 187212 INFO nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] instance snapshotting#033[00m
Dec  5 07:03:34 np0005546909 nova_compute[187208]: 2025-12-05 12:03:34.947 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Beginning live snapshot process#033[00m
Dec  5 07:03:35 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.151 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.213 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.216 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.278 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json -f qcow2" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.291 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.345 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.347 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:35Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:70:f2 10.100.0.12
Dec  5 07:03:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:35Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:70:f2 10.100.0.12
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.796 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta 1073741824" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.798 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Quiescing instance not available: QEMU guest agent is not enabled.
Dec  5 07:03:35 np0005546909 nova_compute[187208]: 2025-12-05 12:03:35.854 187212 DEBUG nova.virt.libvirt.guest [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 0 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.358 187212 DEBUG nova.virt.libvirt.guest [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] COPY block job progress, current cursor: 75431936 final cursor: 75431936 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.364 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.412 187212 DEBUG nova.privsep.utils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.413 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.977 187212 DEBUG oslo_concurrency.processutils [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006.delta /var/lib/nova/instances/snapshots/tmphaqxbclj/091a7fc80df3445e9b24863e1a5d6006" returned: 0 in 0.564s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:36 np0005546909 nova_compute[187208]: 2025-12-05 12:03:36.985 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot extracted, beginning image upload
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.131 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.132 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.132 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.154 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.155 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.156 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.632 187212 DEBUG nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.632 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG oslo_concurrency.lockutils [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "478fa005-452c-4e37-a919-63bb734a3c5c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 DEBUG nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] No waiting events found dispatching network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.633 187212 WARNING nova.compute.manager [req-ac037226-b8ed-464e-a0ed-b45cbca17647 req-24c5b97f-c86d-43a8-8d77-cfdf18652792 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Received unexpected event network-vif-plugged-7022c257-2ab5-436e-9757-387e9de66b18 for instance with vm_state deleted and task_state None.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.699 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.699 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.734 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.734 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.735 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Processing event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.736 187212 DEBUG oslo_concurrency.lockutils [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.737 187212 DEBUG nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.737 187212 WARNING nova.compute.manager [req-0e90e1c1-b491-412e-aa15-bfc46701c4f4 req-125a486d-1b25-43ba-93c8-4facbc6cb8d7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state building and task_state spawning.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.738 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance event wait completed in 11 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.739 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.743 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936217.7426016, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.745 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Resumed (Lifecycle Event)
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.757 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.761 187212 INFO nova.virt.libvirt.driver [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance spawned successfully.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.761 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.774 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.778 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.793 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.794 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.795 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.795 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.796 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.797 187212 DEBUG nova.virt.libvirt.driver [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.807 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.849 187212 INFO nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 18.11 seconds to spawn the instance on the hypervisor.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.850 187212 DEBUG nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.920 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.920 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.924 187212 INFO nova.compute.manager [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 18.96 seconds to build instance.
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.928 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.929 187212 INFO nova.compute.claims [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:03:37 np0005546909 nova_compute[187208]: 2025-12-05 12:03:37.948 187212 DEBUG oslo_concurrency.lockutils [None req-0a72a701-11bd-4de8-9a79-62706cc768ee a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.286 187212 DEBUG nova.compute.provider_tree [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.400 187212 DEBUG nova.scheduler.client.report [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.551 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.552 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.635 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.636 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.669 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.688 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.788 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.789 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.790 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating image(s)
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.791 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.791 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.792 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.810 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.892 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.893 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.894 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.907 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:38.911 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:38.912 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.965 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.966 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.986 187212 DEBUG nova.policy [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:03:38 np0005546909 nova_compute[187208]: 2025-12-05 12:03:38.999 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.000 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.000 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.052 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.053 187212 DEBUG nova.virt.disk.api [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.053 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.110 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00276|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00277|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00278|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00279|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.112 187212 DEBUG nova.virt.disk.api [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.114 187212 DEBUG nova.objects.instance [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.146 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.146 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Ensure instance console log exists: /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.147 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:39 np0005546909 podman[220911]: 2025-12-05 12:03:39.199927553 +0000 UTC m=+0.052326936 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.271 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [{"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.311 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-d70544d6-04e3-4b2a-914a-72db3052216a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.311 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.312 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.343 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.344 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00280|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00281|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00282|binding|INFO|Releasing lport 41d0fbd3-22b2-4ee9-8c84-9f176e5ee865 from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00283|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.421 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.422 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.431 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.450 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.470 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.471 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.500 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.507 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.508 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.552 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.553 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.561 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.561 187212 INFO nova.compute.claims [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.573 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.579 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.650 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.651 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.723 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.730 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.798 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.800 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.829 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.873 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.874 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.874 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.875 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.875 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.877 187212 INFO nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Terminating instance#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.878 187212 DEBUG nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.879 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.886 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:39 np0005546909 kernel: tapa35b6b13-07 (unregistering): left promiscuous mode
Dec  5 07:03:39 np0005546909 NetworkManager[55691]: <info>  [1764936219.9002] device (tapa35b6b13-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00284|binding|INFO|Releasing lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 from this chassis (sb_readonly=0)
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00285|binding|INFO|Setting lport a35b6b13-07bc-4c91-aaf5-231163a6ea44 down in Southbound
Dec  5 07:03:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:39Z|00286|binding|INFO|Removing iface tapa35b6b13-07 ovn-installed in OVS
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.921 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:a5:f2 10.100.0.10'], port_security=['fa:16:3e:91:a5:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '05008cd8-8cac-482b-9ff8-68f2f0aaa6d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5223579-477c-4fbe-a58c-2e56f428541c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70409a2f9710408cb377a61250853fbd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cfbac9bb-a6fa-4e30-b1e0-c07877ef21de', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85bd0838-1b85-4e8e-bf67-d21df8aa9251, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a35b6b13-07bc-4c91-aaf5-231163a6ea44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.923 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a35b6b13-07bc-4c91-aaf5-231163a6ea44 in datapath f5223579-477c-4fbe-a58c-2e56f428541c unbound from our chassis#033[00m
Dec  5 07:03:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.925 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5223579-477c-4fbe-a58c-2e56f428541c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.926 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d259fab-54dc-43c4-86ec-2a1e2ed40565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:39.927 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c namespace which is not needed anymore#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:39 np0005546909 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000023.scope: Deactivated successfully.
Dec  5 07:03:39 np0005546909 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000023.scope: Consumed 7.231s CPU time.
Dec  5 07:03:39 np0005546909 systemd-machined[153543]: Machine qemu-39-instance-00000023 terminated.
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.966 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:39 np0005546909 nova_compute[187208]: 2025-12-05 12:03:39.966 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.005 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Successfully created port: 9dc35efb-0aed-463b-860e-3b60dd65b6db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.042 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.105 187212 DEBUG nova.compute.provider_tree [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.131 187212 DEBUG nova.scheduler.client.report [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.148 187212 INFO nova.virt.libvirt.driver [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Instance destroyed successfully.#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.149 187212 DEBUG nova.objects.instance [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lazy-loading 'resources' on Instance uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.192 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.193 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.197 187212 DEBUG nova.virt.libvirt.vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-InstanceActionsNegativeTestJSON-server-378209780',display_name='tempest-InstanceActionsNegativeTestJSON-server-378209780',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-instanceactionsnegativetestjson-server-378209780',id=35,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70409a2f9710408cb377a61250853fbd',ramdisk_id='',reservation_id='r-1rqpjj9v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsNegativeTestJSON-1806311246',owner_user_name='tempest-InstanceActionsNegativeTestJSON-1806311246-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:33Z,user_data=None,user_id='18569d5748e8448fbd1bcbf5d37ff5f6',uuid=05008cd8-8cac-482b-9ff8-68f2f0aaa6d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.197 187212 DEBUG nova.network.os_vif_util [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converting VIF {"id": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "address": "fa:16:3e:91:a5:f2", "network": {"id": "f5223579-477c-4fbe-a58c-2e56f428541c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1983905943-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70409a2f9710408cb377a61250853fbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa35b6b13-07", "ovs_interfaceid": "a35b6b13-07bc-4c91-aaf5-231163a6ea44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.198 187212 DEBUG nova.network.os_vif_util [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.198 187212 DEBUG os_vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.200 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.200 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa35b6b13-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.201 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.202 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.205 187212 INFO os_vif [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:a5:f2,bridge_name='br-int',has_traffic_filtering=True,id=a35b6b13-07bc-4c91-aaf5-231163a6ea44,network=Network(f5223579-477c-4fbe-a58c-2e56f428541c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa35b6b13-07')#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.206 187212 INFO nova.virt.libvirt.driver [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deleting instance files /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4_del#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.206 187212 INFO nova.virt.libvirt.driver [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deletion of /var/lib/nova/instances/05008cd8-8cac-482b-9ff8-68f2f0aaa6d4_del complete#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.211 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.212 187212 INFO nova.compute.claims [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.258 187212 INFO nova.virt.libvirt.driver [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Snapshot image upload complete#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.258 187212 INFO nova.compute.manager [None req-18f3ef8a-a2e0-4e3c-b9a6-6a6181e89ef8 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 5.53 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.307 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.308 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5040MB free_disk=73.18467330932617GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.308 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.328 187212 INFO nova.compute.manager [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG oslo.service.loopingcall [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.329 187212 DEBUG nova.network.neutron [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.337 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.338 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.366 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:03:40 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:40 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [NOTICE]   (220738) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:40 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [ALERT]    (220738) : Current worker (220740) exited with code 143 (Terminated)
Dec  5 07:03:40 np0005546909 neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c[220717]: [WARNING]  (220738) : All workers exited. Exiting... (0)
Dec  5 07:03:40 np0005546909 systemd[1]: libpod-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope: Deactivated successfully.
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.400 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:03:40 np0005546909 podman[220980]: 2025-12-05 12:03:40.405936301 +0000 UTC m=+0.382790245 container died 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:40 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:40 np0005546909 systemd[1]: var-lib-containers-storage-overlay-78756f8a0c787d9c7273ba58bfae472f50372e3fa267a0cd0bfee40b23f73738-merged.mount: Deactivated successfully.
Dec  5 07:03:40 np0005546909 podman[220980]: 2025-12-05 12:03:40.513349889 +0000 UTC m=+0.490203813 container cleanup 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:03:40 np0005546909 systemd[1]: libpod-conmon-80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a.scope: Deactivated successfully.
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.574 187212 DEBUG nova.compute.provider_tree [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.577 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.578 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating image(s)#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.579 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.580 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.594 187212 DEBUG nova.scheduler.client.report [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.597 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.619 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.620 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.626 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.656 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.657 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.658 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:40 np0005546909 podman[221030]: 2025-12-05 12:03:40.66780683 +0000 UTC m=+0.128194042 container remove 80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.671 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7308ce7d-08be-4944-bff3-e47d1d8ad6f6]: (4, ('Fri Dec  5 12:03:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c (80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a)\n80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a\nFri Dec  5 12:03:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c (80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a)\n80cffa14bd3c350d1fccd9b37cc74267fb7ec94059b7f1b7822353a5f3f0f87a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.674 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c644b93-f774-4a26-bfbb-6aea6d819342]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.675 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5223579-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:40 np0005546909 kernel: tapf5223579-40: left promiscuous mode
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.714 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.717 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[448c27ca-3ea1-4dfe-8a88-78bf2efac067]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.718 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.734 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f66561ae-cfde-4319-8f5b-8826b0c0a6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.736 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43ddc131-0654-416a-82a9-c165e82307b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.749 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.753 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.754 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.758 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0d991f-f7c5-41cc-8b92-cc391bf30283]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358778, 'reachable_time': 43445, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221051, 'error': None, 'target': 'ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.762 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5223579-477c-4fbe-a58c-2e56f428541c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:40 np0005546909 systemd[1]: run-netns-ovnmeta\x2df5223579\x2d477c\x2d4fbe\x2da58c\x2d2e56f428541c.mount: Deactivated successfully.
Dec  5 07:03:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:40.762 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7f321cef-74d3-4c0e-b552-c84dcdcc94e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.782 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.787 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d70544d6-04e3-4b2a-914a-72db3052216a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance d2085dd9-2ebd-4804-99c1-3b15cbd216f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c1e2f189-1777-4f28-97ab-72cf0f60fbc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.788 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 30a55909-059f-4a0c-9598-14cc506d42a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 456f1972-6ed7-4fc2-b046-fa035704d434 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.789 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.790 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.815 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.817 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.817 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.875 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.877 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.877 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.901 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.903 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.903 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating image(s)
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.904 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.904 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.905 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.920 187212 DEBUG nova.policy [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.925 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.953 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.954 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.954 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.970 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.970 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Ensure instance console log exists: /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.971 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.978 187212 DEBUG nova.policy [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.989 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.990 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:40 np0005546909 nova_compute[187208]: 2025-12-05 12:03:40.990 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.002 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.062 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.063 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.093 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.119 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.163 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.164 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.165 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.187 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk 1073741824" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.188 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.189 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.253 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.254 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.254 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.271 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Successfully updated port: 9dc35efb-0aed-463b-860e-3b60dd65b6db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.286 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.287 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.287 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.315 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.316 187212 DEBUG nova.virt.disk.api [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.316 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.335 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.335 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Ensure instance console log exists: /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.337 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:41.915 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.921 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.923 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.951 187212 WARNING nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] While synchronizing instance power states, found 7 instances in the database and 3 instances on the hypervisor.
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.951 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid d70544d6-04e3-4b2a-914a-72db3052216a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.952 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 30a55909-059f-4a0c-9598-14cc506d42a2 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Triggering sync for uuid 456f1972-6ed7-4fc2-b046-fa035704d434 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.953 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.954 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.955 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.956 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.956 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.957 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.958 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.958 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  5 07:03:41 np0005546909 nova_compute[187208]: 2025-12-05 12:03:41.959 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.028 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.029 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.029 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.075s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.098 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.203 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.203 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG oslo_concurrency.lockutils [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.204 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.205 187212 DEBUG nova.compute.manager [req-26467ffb-fe20-4d2c-995b-4ca0498dd05f req-637bdb0c-4683-411b-8b7f-5a094fee98d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-unplugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.352 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Successfully created port: 4f7ea95e-e59f-4941-83b6-5c482617a975 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.954 187212 DEBUG nova.network.neutron [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:42 np0005546909 nova_compute[187208]: 2025-12-05 12:03:42.981 187212 INFO nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Took 2.65 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.040 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.041 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.092 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Successfully created port: 909107ba-c90a-4004-a47f-e5367cab8f82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.173 187212 DEBUG nova.objects.instance [None req-b0b56b73-2f59-4f5c-8732-2fcb3b32ac53 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.193 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936223.1932034, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.193 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.216 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.220 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.246 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.259 187212 DEBUG nova.compute.provider_tree [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.275 187212 DEBUG nova.scheduler.client.report [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.308 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.349 187212 INFO nova.scheduler.client.report [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Deleted allocations for instance 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.633 187212 DEBUG oslo_concurrency.lockutils [None req-8d299a41-9b25-4d04-bf4e-de691fa3016c 18569d5748e8448fbd1bcbf5d37ff5f6 70409a2f9710408cb377a61250853fbd - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.634 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 1.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.635 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] During sync_power_state the instance has a pending task (deleting). Skip.#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.635 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:43 np0005546909 kernel: tapecec1a41-6f (unregistering): left promiscuous mode
Dec  5 07:03:43 np0005546909 NetworkManager[55691]: <info>  [1764936223.8128] device (tapecec1a41-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00287|binding|INFO|Releasing lport ecec1a41-6f3e-4852-8cdb-9d461eded987 from this chassis (sb_readonly=0)
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00288|binding|INFO|Setting lport ecec1a41-6f3e-4852-8cdb-9d461eded987 down in Southbound
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00289|binding|INFO|Removing iface tapecec1a41-6f ovn-installed in OVS
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00290|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00291|binding|INFO|Releasing lport 55380907-78ff-4f14-8b9a-7ccb714bf36a from this chassis (sb_readonly=0)
Dec  5 07:03:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:43Z|00292|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec  5 07:03:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.836 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:57:88:7f 10.100.0.5'], port_security=['fa:16:3e:57:88:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c1e2f189-1777-4f28-97ab-72cf0f60fbc0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ecec1a41-6f3e-4852-8cdb-9d461eded987) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.839 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ecec1a41-6f3e-4852-8cdb-9d461eded987 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis#033[00m
Dec  5 07:03:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.844 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.845 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4012423f-77e3-4df9-9d3e-78146d2512bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:43.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore#033[00m
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:43 np0005546909 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000024.scope: Deactivated successfully.
Dec  5 07:03:43 np0005546909 nova_compute[187208]: 2025-12-05 12:03:43.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:43 np0005546909 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000024.scope: Consumed 5.842s CPU time.
Dec  5 07:03:43 np0005546909 systemd-machined[153543]: Machine qemu-40-instance-00000024 terminated.
Dec  5 07:03:43 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:43 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [NOTICE]   (220617) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:43 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [WARNING]  (220617) : Exiting Master process...
Dec  5 07:03:43 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [ALERT]    (220617) : Current worker (220619) exited with code 143 (Terminated)
Dec  5 07:03:43 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[220613]: [WARNING]  (220617) : All workers exited. Exiting... (0)
Dec  5 07:03:43 np0005546909 systemd[1]: libpod-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope: Deactivated successfully.
Dec  5 07:03:43 np0005546909 conmon[220613]: conmon f442b1005a00d2fb0330 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope/container/memory.events
Dec  5 07:03:43 np0005546909 podman[221102]: 2025-12-05 12:03:43.997321343 +0000 UTC m=+0.054859928 container died f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:03:44 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:44 np0005546909 systemd[1]: var-lib-containers-storage-overlay-d1e943a749671a1b77c4cdf6fa31b99e20bf9ab4dc014c673f02b47bd056cb36-merged.mount: Deactivated successfully.
Dec  5 07:03:44 np0005546909 podman[221102]: 2025-12-05 12:03:44.03994368 +0000 UTC m=+0.097482265 container cleanup f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  5 07:03:44 np0005546909 systemd[1]: libpod-conmon-f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7.scope: Deactivated successfully.
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.050 187212 DEBUG nova.compute.manager [None req-b0b56b73-2f59-4f5c-8732-2fcb3b32ac53 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.093 187212 DEBUG nova.network.neutron [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:44 np0005546909 podman[221145]: 2025-12-05 12:03:44.109078005 +0000 UTC m=+0.045977014 container remove f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.116 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d23ea96e-8102-409b-b839-6aef03ee9807]: (4, ('Fri Dec  5 12:03:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7)\nf442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7\nFri Dec  5 12:03:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (f442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7)\nf442b1005a00d2fb0330e6863878a4c107debc5b430f172b50b0f83266f698c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20dac5c5-dc4a-4096-a0ac-f40387f31190]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 kernel: tap41b3b495-c0: left promiscuous mode
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.154 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.154 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance network_info: |[{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.157 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start _get_guest_xml network_info=[{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.161 187212 WARNING nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.166 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.166 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.168 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.libvirt.host [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.169 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.170 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[827e9e7e-e0b3-4fe9-b310-c01a1fee7d30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.171 187212 DEBUG nova.virt.hardware [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.175 187212 DEBUG nova.virt.libvirt.vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSO
N-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:38Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.177 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.178 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.179 187212 DEBUG nova.objects.instance [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb888422-31ab-4f79-8b34-f3d2ddd89f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.194 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[318cb6ed-80c2-493c-a3e6-2d6b7c2b16af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.207 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e2322b-4945-47f0-baa1-aecc03892359]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358648, 'reachable_time': 16458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221164, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.209 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <uuid>30a55909-059f-4a0c-9598-14cc506d42a2</uuid>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <name>instance-00000025</name>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersTestJSON-server-1329126976</nova:name>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:44</nova:creationTime>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        <nova:port uuid="9dc35efb-0aed-463b-860e-3b60dd65b6db">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="serial">30a55909-059f-4a0c-9598-14cc506d42a2</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="uuid">30a55909-059f-4a0c-9598-14cc506d42a2</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:4b:04:08"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <target dev="tap9dc35efb-0a"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/console.log" append="off"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:44 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:44 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:44 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:44 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.210 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Preparing to wait for external event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:44 np0005546909 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.210 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.211 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.211 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.211 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:44.211 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0422cdce-9c89-44fa-a2ce-e4e5321b75a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.212 187212 DEBUG nova.virt.libvirt.vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServ
ersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:38Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.212 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.213 187212 DEBUG nova.network.os_vif_util [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.213 187212 DEBUG os_vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.214 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.214 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.215 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.217 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9dc35efb-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9dc35efb-0a, col_values=(('external_ids', {'iface-id': '9dc35efb-0aed-463b-860e-3b60dd65b6db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:04:08', 'vm-uuid': '30a55909-059f-4a0c-9598-14cc506d42a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.219 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 NetworkManager[55691]: <info>  [1764936224.2205] manager: (tap9dc35efb-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.227 187212 INFO os_vif [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a')#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.285 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Successfully updated port: 4f7ea95e-e59f-4941-83b6-5c482617a975 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.293 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.294 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.294 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:4b:04:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.295 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Using config drive#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.321 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.322 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.322 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.890 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.894 187212 INFO nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Creating config drive at /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config#033[00m
Dec  5 07:03:44 np0005546909 nova_compute[187208]: 2025-12-05 12:03:44.898 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkhpjpk3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.027 187212 DEBUG oslo_concurrency.processutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkhpjpk3" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:45 np0005546909 kernel: tap9dc35efb-0a: entered promiscuous mode
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.0836] manager: (tap9dc35efb-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Dec  5 07:03:45 np0005546909 systemd-udevd[221081]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:45Z|00293|binding|INFO|Claiming lport 9dc35efb-0aed-463b-860e-3b60dd65b6db for this chassis.
Dec  5 07:03:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:45Z|00294|binding|INFO|9dc35efb-0aed-463b-860e-3b60dd65b6db: Claiming fa:16:3e:4b:04:08 10.100.0.4
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.0957] device (tap9dc35efb-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.0964] device (tap9dc35efb-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.098 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:04:08 10.100.0.4'], port_security=['fa:16:3e:4b:04:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30a55909-059f-4a0c-9598-14cc506d42a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9dc35efb-0aed-463b-860e-3b60dd65b6db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.099 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9dc35efb-0aed-463b-860e-3b60dd65b6db in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.101 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[980db7ab-6b24-4be6-920e-a14d2d29a6e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.112 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.114 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64a1fe69-462b-4864-8719-523a5eec5567]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.115 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a25dfc-07dc-4139-8785-ce3352ed2f45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 systemd-machined[153543]: New machine qemu-41-instance-00000025.
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.125 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[31232b55-7860-4c87-b173-cfb48a7d7174]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.142 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2cdbd0-0e70-482b-beef-c530bfb84535]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 systemd[1]: Started Virtual Machine qemu-41-instance-00000025.
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:45Z|00295|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db ovn-installed in OVS
Dec  5 07:03:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:45Z|00296|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db up in Southbound
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[650d20d9-917a-4bdd-8206-65b295d802e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.1814] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.181 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b158d46-b534-4650-a3d2-640ea9e9d210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.205 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11e6e5e2-c431-48f6-bc33-fabff5ba4ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.208 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1409eb-5815-4fbc-a398-92315902ce8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.2256] device (tapd7360f84-b0): carrier: link connected
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.226 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Successfully updated port: 909107ba-c90a-4004-a47f-e5367cab8f82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.231 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc22804-20a9-4b9e-bb0b-1ac01660406f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.249 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.249 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe5de8d-e5bd-4d94-89a4-ed89dbd62eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360578, 'reachable_time': 17423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221222, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[664b1c2c-6f5f-466c-b0d6-69c1701bf117]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360578, 'tstamp': 360578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221223, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.286 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8f5aacd4-dc9c-472d-bc50-4103b6e88ce2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 81], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360578, 'reachable_time': 17423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221224, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.311 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7187502a-e780-4b59-92fa-0491ee13ab89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.329 187212 DEBUG nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG oslo_concurrency.lockutils [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 DEBUG nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.330 187212 WARNING nova.compute.manager [req-ffcd4f2d-edfd-4e05-90e4-3158665366d9 req-943fe852-3c6d-479a-bb3a-c191f31af3ab 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-unplugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state suspended and task_state None.#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.376 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d401b818-c885-4508-b2aa-1f14e48a186e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.377 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.377 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.378 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.379 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 NetworkManager[55691]: <info>  [1764936225.3803] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Dec  5 07:03:45 np0005546909 kernel: tapd7360f84-b0: entered promiscuous mode
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.385 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:45Z|00297|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.407 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.408 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb5c847-6237-4fa0-a483-1762fce664b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.409 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:45.409 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.437 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936225.4372697, 30a55909-059f-4a0c-9598-14cc506d42a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.438 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.458 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.462 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936225.4398158, 30a55909-059f-4a0c-9598-14cc506d42a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.462 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.480 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.483 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.502 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.591 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "05008cd8-8cac-482b-9ff8-68f2f0aaa6d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.592 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] No waiting events found dispatching network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 WARNING nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received unexpected event network-vif-plugged-a35b6b13-07bc-4c91-aaf5-231163a6ea44 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-changed-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Refreshing instance network info cache due to event network-changed-9dc35efb-0aed-463b-860e-3b60dd65b6db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.593 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Refreshing network info cache for port 9dc35efb-0aed-463b-860e-3b60dd65b6db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:45 np0005546909 podman[221268]: 2025-12-05 12:03:45.773610769 +0000 UTC m=+0.052841131 container create a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:45 np0005546909 systemd[1]: Started libpod-conmon-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope.
Dec  5 07:03:45 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:45 np0005546909 podman[221268]: 2025-12-05 12:03:45.743286913 +0000 UTC m=+0.022517315 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:45 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aba16f9d308e86ed666c24a9528a5e9f58db6e3fb9b48c984c73f0766f63478d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:45 np0005546909 podman[221268]: 2025-12-05 12:03:45.857138995 +0000 UTC m=+0.136369387 container init a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 07:03:45 np0005546909 podman[221268]: 2025-12-05 12:03:45.862126837 +0000 UTC m=+0.141357199 container start a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:03:45 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : New worker (221289) forked
Dec  5 07:03:45 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : Loading success.
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.935 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.943 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.975 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.975 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance network_info: |[{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.977 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start _get_guest_xml network_info=[{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.983 187212 WARNING nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.987 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.988 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.991 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.992 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.993 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.993 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.994 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.995 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.996 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:45 np0005546909 nova_compute[187208]: 2025-12-05 12:03:45.996 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.001 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.002 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.002 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.004 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <uuid>456f1972-6ed7-4fc2-b046-fa035704d434</uuid>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <name>instance-00000027</name>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-2007104146-2</nova:name>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:45</nova:creationTime>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        <nova:port uuid="4f7ea95e-e59f-4941-83b6-5c482617a975">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="serial">456f1972-6ed7-4fc2-b046-fa035704d434</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="uuid">456f1972-6ed7-4fc2-b046-fa035704d434</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:4a:7b:36"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <target dev="tap4f7ea95e-e5"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/console.log" append="off"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:46 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:46 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:46 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:46 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Preparing to wait for external event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.020 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.021 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.021 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.023 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.023 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.026 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f7ea95e-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.027 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f7ea95e-e5, col_values=(('external_ids', {'iface-id': '4f7ea95e-e59f-4941-83b6-5c482617a975', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:7b:36', 'vm-uuid': '456f1972-6ed7-4fc2-b046-fa035704d434'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:46 np0005546909 NetworkManager[55691]: <info>  [1764936226.0291] manager: (tap4f7ea95e-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.031 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.040 187212 INFO os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5')#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.096 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:4a:7b:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.097 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Using config drive#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.336 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936211.3357682, 478fa005-452c-4e37-a919-63bb734a3c5c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.337 187212 INFO nova.compute.manager [-] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.370 187212 DEBUG nova.compute.manager [None req-10a4bc21-c7db-4067-b037-182ce6d7175d - - - - - -] [instance: 478fa005-452c-4e37-a919-63bb734a3c5c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.604 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Creating config drive at /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.610 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcy3f46ov execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.738 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcy3f46ov" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:46 np0005546909 kernel: tap4f7ea95e-e5: entered promiscuous mode
Dec  5 07:03:46 np0005546909 NetworkManager[55691]: <info>  [1764936226.8018] manager: (tap4f7ea95e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Dec  5 07:03:46 np0005546909 systemd-udevd[221211]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:46Z|00298|binding|INFO|Claiming lport 4f7ea95e-e59f-4941-83b6-5c482617a975 for this chassis.
Dec  5 07:03:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:46Z|00299|binding|INFO|4f7ea95e-e59f-4941-83b6-5c482617a975: Claiming fa:16:3e:4a:7b:36 10.100.0.3
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 NetworkManager[55691]: <info>  [1764936226.8196] device (tap4f7ea95e-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:46 np0005546909 NetworkManager[55691]: <info>  [1764936226.8203] device (tap4f7ea95e-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.820 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.822 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 bound to our chassis#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.824 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.836 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee69182f-39b2-4ec1-bcee-7071250cd57e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.837 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb8ea1ed6-91 in ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.840 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb8ea1ed6-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c190bdd2-1d4c-4804-bcb4-2a16ad31090a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.842 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[078c941d-d4fb-4d2b-b91d-ab0497c0f4f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 systemd-machined[153543]: New machine qemu-42-instance-00000027.
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.852 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2566efdc-a622-45d6-b447-7caaabb86445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.869 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac8cf0e3-c0a3-4376-a026-3cbb5f1338b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 systemd[1]: Started Virtual Machine qemu-42-instance-00000027.
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.930 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:46Z|00300|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 ovn-installed in OVS
Dec  5 07:03:46 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:46Z|00301|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 up in Southbound
Dec  5 07:03:46 np0005546909 nova_compute[187208]: 2025-12-05 12:03:46.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.955 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72a734b6-c880-4669-abcf-1b3f30e9ec12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 systemd-udevd[221326]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:46 np0005546909 NetworkManager[55691]: <info>  [1764936226.9614] manager: (tapb8ea1ed6-90): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.960 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[51fb1acb-3369-4d1b-b673-9c98767b19c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.987 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[77604098-e122-4b75-b392-c046d0f8acd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:46.991 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[318203fb-1dca-4044-a5a8-d825a4a4ec90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 NetworkManager[55691]: <info>  [1764936227.0109] device (tapb8ea1ed6-90): carrier: link connected
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.014 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[063906d6-b394-497c-842d-fe08ffa82372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.029 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a89671aa-5a2c-43d6-8f91-79e436706594]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221357, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.043 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5887dba9-d83f-48ff-a406-0b752e38bd14]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:fb51'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360757, 'tstamp': 360757}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221358, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.057 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e1db39cd-bc38-4baa-b5ba-7623132b0ec7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221359, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dfec196d-e8f6-48bd-9555-ba49d1a0605f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4000348f-2c86-47f6-91de-89a57f5ef3f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.154 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.155 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 kernel: tapb8ea1ed6-90: entered promiscuous mode
Dec  5 07:03:47 np0005546909 NetworkManager[55691]: <info>  [1764936227.1577] manager: (tapb8ea1ed6-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.165 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:47Z|00302|binding|INFO|Releasing lport 6f012c31-72e4-4df5-be68-787aa910fb9c from this chassis (sb_readonly=0)
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.179 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.185 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.186 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7f7d8d-b341-455a-9437-61ce68a1486e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.187 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.pid.haproxy
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:03:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:47.188 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'env', 'PROCESS_TAG=haproxy-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:03:47 np0005546909 podman[221365]: 2025-12-05 12:03:47.220518627 +0000 UTC m=+0.066127330 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.351 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updated VIF entry in instance network info cache for port 9dc35efb-0aed-463b-860e-3b60dd65b6db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.352 187212 DEBUG nova.network.neutron [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [{"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.382 187212 DEBUG oslo_concurrency.lockutils [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-30a55909-059f-4a0c-9598-14cc506d42a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.383 187212 DEBUG nova.compute.manager [req-e1736304-d4ff-4901-a36f-ba7a4bc16aaf req-eaa12ccc-5311-403b-a8c2-06ceac0eaa39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Received event network-vif-deleted-a35b6b13-07bc-4c91-aaf5-231163a6ea44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.400 187212 DEBUG nova.network.neutron [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.419 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Releasing lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.419 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance network_info: |[{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.423 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start _get_guest_xml network_info=[{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.425 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936227.4244573, 456f1972-6ed7-4fc2-b046-fa035704d434 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.425 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.431 187212 WARNING nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.435 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.436 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.441 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.441 187212 DEBUG nova.virt.libvirt.host [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.442 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.443 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.444 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.445 187212 DEBUG nova.virt.hardware [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.448 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-Multi
pleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.449 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.449 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.450 187212 DEBUG nova.objects.instance [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'pci_devices' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.452 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.456 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936227.4254956, 456f1972-6ed7-4fc2-b046-fa035704d434 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.457 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.497 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.499 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <uuid>7df02f69-ecc9-424d-82ab-dc8ba279ffd5</uuid>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <name>instance-00000026</name>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-2007104146-1</nova:name>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:03:47</nova:creationTime>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:user uuid="40620135b1ff4f8d9d80eb79f51fd593">tempest-MultipleCreateTestJSON-1941206426-project-member</nova:user>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:project uuid="bebbbd9623064681bb9350747fba600e">tempest-MultipleCreateTestJSON-1941206426</nova:project>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        <nova:port uuid="909107ba-c90a-4004-a47f-e5367cab8f82">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="serial">7df02f69-ecc9-424d-82ab-dc8ba279ffd5</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="uuid">7df02f69-ecc9-424d-82ab-dc8ba279ffd5</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:16:8e:a6"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <target dev="tap909107ba-c9"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/console.log" append="off"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:03:47 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:03:47 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:03:47 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:03:47 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.500 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Preparing to wait for external event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.501 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.virt.libvirt.vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tem
pest-MultipleCreateTestJSON-1941206426-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.502 187212 DEBUG nova.network.os_vif_util [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.503 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.504 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.507 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap909107ba-c9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.508 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap909107ba-c9, col_values=(('external_ids', {'iface-id': '909107ba-c90a-4004-a47f-e5367cab8f82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:16:8e:a6', 'vm-uuid': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:47 np0005546909 NetworkManager[55691]: <info>  [1764936227.5104] manager: (tap909107ba-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.510 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.520 187212 INFO os_vif [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9')#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] No VIF found with MAC fa:16:3e:16:8e:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:03:47 np0005546909 nova_compute[187208]: 2025-12-05 12:03:47.575 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Using config drive#033[00m
Dec  5 07:03:47 np0005546909 podman[221423]: 2025-12-05 12:03:47.615432807 +0000 UTC m=+0.052985524 container create 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:03:47 np0005546909 systemd[1]: Started libpod-conmon-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope.
Dec  5 07:03:47 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:03:47 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cea3cddfc7f419ead74ac9c1e8d910318a876e210e0687b74d3349159defed7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:03:47 np0005546909 podman[221423]: 2025-12-05 12:03:47.589722213 +0000 UTC m=+0.027274910 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:03:47 np0005546909 podman[221423]: 2025-12-05 12:03:47.694831635 +0000 UTC m=+0.132384322 container init 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:03:47 np0005546909 podman[221423]: 2025-12-05 12:03:47.70023716 +0000 UTC m=+0.137789847 container start 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:47 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : New worker (221445) forked
Dec  5 07:03:47 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : Loading success.
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.186 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-changed-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.186 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Refreshing instance network info cache due to event network-changed-4f7ea95e-e59f-4941-83b6-5c482617a975. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.187 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Refreshing network info cache for port 4f7ea95e-e59f-4941-83b6-5c482617a975 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.397 187212 INFO nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Creating config drive at /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.402 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyj7g2gt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.533 187212 DEBUG oslo_concurrency.processutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeyj7g2gt" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:48 np0005546909 kernel: tap909107ba-c9: entered promiscuous mode
Dec  5 07:03:48 np0005546909 NetworkManager[55691]: <info>  [1764936228.6054] manager: (tap909107ba-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Dec  5 07:03:48 np0005546909 systemd-udevd[221340]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:48Z|00303|binding|INFO|Claiming lport 909107ba-c90a-4004-a47f-e5367cab8f82 for this chassis.
Dec  5 07:03:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:48Z|00304|binding|INFO|909107ba-c90a-4004-a47f-e5367cab8f82: Claiming fa:16:3e:16:8e:a6 10.100.0.12
Dec  5 07:03:48 np0005546909 NetworkManager[55691]: <info>  [1764936228.6251] device (tap909107ba-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:03:48 np0005546909 NetworkManager[55691]: <info>  [1764936228.6260] device (tap909107ba-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:03:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:48Z|00305|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 ovn-installed in OVS
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.635 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:48 np0005546909 nova_compute[187208]: 2025-12-05 12:03:48.640 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:48 np0005546909 systemd-machined[153543]: New machine qemu-43-instance-00000026.
Dec  5 07:03:48 np0005546909 systemd[1]: Started Virtual Machine qemu-43-instance-00000026.
Dec  5 07:03:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:48Z|00306|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 up in Southbound
Dec  5 07:03:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.990 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:8e:a6 10.100.0.12'], port_security=['fa:16:3e:16:8e:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=909107ba-c90a-4004-a47f-e5367cab8f82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.992 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 909107ba-c90a-4004-a47f-e5367cab8f82 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 bound to our chassis#033[00m
Dec  5 07:03:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:48.994 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af5d40c8-bb7c-45fb-b27c-6515486fc366]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.050 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a243c6b-73aa-4d3e-a6bb-cdd9bf3a31d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.054 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2943ac3-f567-4a84-b1d8-951bad60a961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.091 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8c6570-3d4a-4e92-9ccb-6ff31beb7fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.109 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1cf392-e299-4f0f-9d91-28effeb36615]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221487, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.133 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[96f0f335-0172-422e-b7aa-baf1c85cd8a2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360767, 'tstamp': 360767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221488, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360771, 'tstamp': 360771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221488, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.148 187212 DEBUG nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.150 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:49 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:49.151 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.173 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-changed-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Refreshing instance network info cache due to event network-changed-909107ba-c90a-4004-a47f-e5367cab8f82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.174 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.175 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Refreshing network info cache for port 909107ba-c90a-4004-a47f-e5367cab8f82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.262 187212 INFO nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] instance snapshotting#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.263 187212 WARNING nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.607 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Beginning cold snapshot process#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.854 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936229.8540106, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:49 np0005546909 nova_compute[187208]: 2025-12-05 12:03:49.855 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.157 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.162 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936229.8566923, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.162 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.193 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.196 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.199 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.199 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.200 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.201 187212 INFO nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Terminating instance#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.202 187212 DEBUG nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.226 187212 DEBUG nova.privsep.utils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:03:50 np0005546909 kernel: tap99a1ab7f-bf (unregistering): left promiscuous mode
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.226 187212 DEBUG oslo_concurrency.processutils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk /var/lib/nova/instances/snapshots/tmp_tvfw6sw/a90bdd0383f44c6e9298f49095fb64de execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:50 np0005546909 NetworkManager[55691]: <info>  [1764936230.2357] device (tap99a1ab7f-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:50Z|00307|binding|INFO|Releasing lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc from this chassis (sb_readonly=0)
Dec  5 07:03:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:50Z|00308|binding|INFO|Setting lport 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc down in Southbound
Dec  5 07:03:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:50Z|00309|binding|INFO|Removing iface tap99a1ab7f-bf ovn-installed in OVS
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.253 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.256 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.264 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:8a:0c 10.100.0.12'], port_security=['fa:16:3e:a9:8a:0c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd70544d6-04e3-4b2a-914a-72db3052216a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39702279-01de-4f4b-bc33-58c8c6f673e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55d3be64e01442ca8f492d2f3e10d1cc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7da5af47-2519-44c3-bc78-6f5347e93e10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94b6aed5-905a-43ff-81d8-6adfe368f476, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.267 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 99a1ab7f-bf64-4cc9-846c-9748ff4a93dc in datapath 39702279-01de-4f4b-bc33-58c8c6f673e3 unbound from our chassis#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.272 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39702279-01de-4f4b-bc33-58c8c6f673e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.273 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f61ffb8d-9303-4707-81a7-06b874aa8526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.274 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 namespace which is not needed anymore#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.278 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.306 187212 DEBUG nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:50 np0005546909 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Dec  5 07:03:50 np0005546909 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000001f.scope: Consumed 13.786s CPU time.
Dec  5 07:03:50 np0005546909 systemd-machined[153543]: Machine qemu-35-instance-0000001f terminated.
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.364 187212 INFO nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] instance snapshotting#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.400 187212 DEBUG oslo_concurrency.processutils [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0/disk /var/lib/nova/instances/snapshots/tmp_tvfw6sw/a90bdd0383f44c6e9298f49095fb64de" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.400 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:03:50 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:50 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [NOTICE]   (219283) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:50 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [WARNING]  (219283) : Exiting Master process...
Dec  5 07:03:50 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [ALERT]    (219283) : Current worker (219285) exited with code 143 (Terminated)
Dec  5 07:03:50 np0005546909 neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3[219279]: [WARNING]  (219283) : All workers exited. Exiting... (0)
Dec  5 07:03:50 np0005546909 systemd[1]: libpod-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope: Deactivated successfully.
Dec  5 07:03:50 np0005546909 podman[221529]: 2025-12-05 12:03:50.424656518 +0000 UTC m=+0.053280443 container died 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:03:50 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:50 np0005546909 systemd[1]: var-lib-containers-storage-overlay-fc8314cf85594aa36784fe5dbf1012ba087261d9468b0e3431f9bb9c756a87d7-merged.mount: Deactivated successfully.
Dec  5 07:03:50 np0005546909 podman[221529]: 2025-12-05 12:03:50.472252967 +0000 UTC m=+0.100876892 container cleanup 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:03:50 np0005546909 systemd[1]: libpod-conmon-004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff.scope: Deactivated successfully.
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.485 187212 INFO nova.virt.libvirt.driver [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Instance destroyed successfully.#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.486 187212 DEBUG nova.objects.instance [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lazy-loading 'resources' on Instance uuid d70544d6-04e3-4b2a-914a-72db3052216a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.515 187212 DEBUG nova.virt.libvirt.vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:02:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-1222437752',display_name='tempest-ImagesOneServerTestJSON-server-1222437752',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-1222437752',id=31,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='55d3be64e01442ca8f492d2f3e10d1cc',ramdisk_id='',reservation_id='r-zbitw7u9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-1350277374',owner_user_name='tempest-ImagesOneServerTestJSON-1350277374-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:40Z,user_data=None,user_id='b5f1bf811e6c42d699922035de0b538c',uuid=d70544d6-04e3-4b2a-914a-72db3052216a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.515 187212 DEBUG nova.network.os_vif_util [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converting VIF {"id": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "address": "fa:16:3e:a9:8a:0c", "network": {"id": "39702279-01de-4f4b-bc33-58c8c6f673e3", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-72201613-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "55d3be64e01442ca8f492d2f3e10d1cc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap99a1ab7f-bf", "ovs_interfaceid": "99a1ab7f-bf64-4cc9-846c-9748ff4a93dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.516 187212 DEBUG nova.network.os_vif_util [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.516 187212 DEBUG os_vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.520 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99a1ab7f-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.522 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.527 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.531 187212 INFO os_vif [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:8a:0c,bridge_name='br-int',has_traffic_filtering=True,id=99a1ab7f-bf64-4cc9-846c-9748ff4a93dc,network=Network(39702279-01de-4f4b-bc33-58c8c6f673e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap99a1ab7f-bf')#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.532 187212 INFO nova.virt.libvirt.driver [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deleting instance files /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a_del#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.532 187212 INFO nova.virt.libvirt.driver [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deletion of /var/lib/nova/instances/d70544d6-04e3-4b2a-914a-72db3052216a_del complete#033[00m
Dec  5 07:03:50 np0005546909 podman[221578]: 2025-12-05 12:03:50.547697472 +0000 UTC m=+0.050956426 container remove 004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf88cf73-6c22-413f-b415-254758d323f1]: (4, ('Fri Dec  5 12:03:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 (004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff)\n004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff\nFri Dec  5 12:03:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 (004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff)\n004459aea328c889b4f166a4b3efa938d39699cbfef17f05a65911c3852178ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6f282c85-0421-4731-acb7-ccb02f4ede8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39702279-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:50 np0005546909 kernel: tap39702279-00: left promiscuous mode
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.631 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.640 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.642 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f7239d9e-585f-4819-9478-f0a8e6624161]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.645 187212 INFO nova.compute.manager [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.646 187212 DEBUG oslo.service.loopingcall [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.646 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Beginning live snapshot process#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.649 187212 DEBUG nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.649 187212 DEBUG nova.network.neutron [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24ceaffd-fda4-4577-9478-05ada5027d81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.656 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d141c7f1-b1c2-456e-97ba-d0d07bfe0845]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.665 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updated VIF entry in instance network info cache for port 4f7ea95e-e59f-4941-83b6-5c482617a975. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.665 187212 DEBUG nova.network.neutron [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [{"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.677 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[26f6d425-7897-43a6-a983-42336b378b07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355488, 'reachable_time': 18343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221596, 'error': None, 'target': 'ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.679 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-39702279-01de-4f4b-bc33-58c8c6f673e3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:50.680 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fe677c4d-d778-4dc0-b128-c73e8aac5ed6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:50 np0005546909 systemd[1]: run-netns-ovnmeta\x2d39702279\x2d01de\x2d4f4b\x2dbc33\x2d58c8c6f673e3.mount: Deactivated successfully.
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.712 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-456f1972-6ed7-4fc2-b046-fa035704d434" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.712 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG oslo_concurrency.lockutils [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.713 187212 DEBUG nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] No waiting events found dispatching network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.714 187212 WARNING nova.compute.manager [req-0e8a9eb9-0f27-4208-8696-b5b6a2eec16b req-850f0dc9-3f34-4db8-a463-c5f3ec96e6b2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received unexpected event network-vif-plugged-ecec1a41-6f3e-4852-8cdb-9d461eded987 for instance with vm_state suspended and task_state None.#033[00m
Dec  5 07:03:50 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.795 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.891 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.892 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.956 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:50 np0005546909 nova_compute[187208]: 2025-12-05 12:03:50.969 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.046 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.048 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.085 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.087 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.138 187212 DEBUG nova.virt.libvirt.guest [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] COPY block job progress, current cursor: 0 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.157 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.158 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.158 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Processing event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.159 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.160 187212 DEBUG oslo_concurrency.lockutils [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.161 187212 DEBUG nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.161 187212 WARNING nova.compute.manager [req-e5d3d81c-a9f8-4c22-a5cc-bcf31d2a846b req-d16117f3-0002-4843-8c15-d72b903d02bd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received unexpected event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.167 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.174 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.1735473, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.174 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.178 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.183 187212 INFO nova.virt.libvirt.driver [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance spawned successfully.#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.184 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.238 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.248 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.254 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.255 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.255 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.256 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.256 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.257 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.428 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updated VIF entry in instance network info cache for port 909107ba-c90a-4004-a47f-e5367cab8f82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.429 187212 DEBUG nova.network.neutron [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [{"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.456 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.456 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG oslo_concurrency.lockutils [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.457 187212 DEBUG nova.compute.manager [req-8e923e87-2120-452d-a2d9-c363a038fb63 req-dd902833-d392-40fa-bbdb-567e784f40fe 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-unplugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.474 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.643 187212 DEBUG nova.virt.libvirt.guest [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] COPY block job progress, current cursor: 75366400 final cursor: 75366400 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.645 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.683 187212 DEBUG nova.privsep.utils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.683 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.805 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7df02f69-ecc9-424d-82ab-dc8ba279ffd5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.806 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.807 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.808 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Processing event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.808 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.809 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 WARNING nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state building and task_state spawning.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.810 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.811 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.811 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Processing event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.812 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.813 187212 DEBUG oslo_concurrency.lockutils [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.814 187212 DEBUG nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.814 187212 WARNING nova.compute.manager [req-7df8a999-3c6f-442a-918d-9456c1211925 req-0519c703-f561-4c45-8dff-a5a91cd78bce 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received unexpected event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with vm_state building and task_state spawning.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.815 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.816 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.820 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.8198025, 30a55909-059f-4a0c-9598-14cc506d42a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.820 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Resumed (Lifecycle Event)
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.824 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.825 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.830 187212 INFO nova.virt.libvirt.driver [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance spawned successfully.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.831 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.834 187212 INFO nova.virt.libvirt.driver [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance spawned successfully.
Dec  5 07:03:51 np0005546909 nova_compute[187208]: 2025-12-05 12:03:51.836 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.052 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 11.47 seconds to spawn the instance on the hypervisor.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.053 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.073 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.078 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.078 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.079 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.079 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.080 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.080 187212 DEBUG nova.virt.libvirt.driver [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.086 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.111 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.111 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.112 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.112 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.113 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.113 187212 DEBUG nova.virt.libvirt.driver [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.119 187212 DEBUG oslo_concurrency.processutils [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c.delta /var/lib/nova/instances/snapshots/tmprt1scfb2/b3d144cf53d9408397995d9f84702b9c" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.124 187212 INFO nova.virt.libvirt.driver [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Snapshot extracted, beginning image upload
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.207 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.208 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936231.8248255, 456f1972-6ed7-4fc2-b046-fa035704d434 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.209 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Resumed (Lifecycle Event)
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.254 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.258 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.269 187212 INFO nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 13.48 seconds to spawn the instance on the hypervisor.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.270 187212 DEBUG nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.273 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 12.74 seconds to build instance.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.284 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 11.38 seconds to spawn the instance on the hypervisor.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.285 187212 DEBUG nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.306 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.346 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.347 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.347 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.348 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.385 187212 INFO nova.compute.manager [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 14.58 seconds to build instance.
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.393 187212 INFO nova.compute.manager [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 12.61 seconds to build instance.#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.682 187212 DEBUG oslo_concurrency.lockutils [None req-c1a12120-a410-4e1c-92f2-798aef749ceb 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.683 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.689 187212 DEBUG oslo_concurrency.lockutils [None req-f9a3364c-b18b-4159-ba46-1c342adbb18e ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.990s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.690 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 10.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.692 187212 INFO nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.693 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:52 np0005546909 nova_compute[187208]: 2025-12-05 12:03:52.735 187212 WARNING nova.compute.manager [None req-53c2eb48-268e-4a44-8c29-6e97fe238441 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Image not found during snapshot: nova.exception.ImageNotFound: Image f951ed45-f10d-4ac3-a0fc-5d19a12add95 could not be found.#033[00m
Dec  5 07:03:53 np0005546909 podman[221634]: 2025-12-05 12:03:53.219385234 +0000 UTC m=+0.057931496 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:03:53 np0005546909 podman[221633]: 2025-12-05 12:03:53.260940771 +0000 UTC m=+0.095358025 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.410 187212 DEBUG nova.network.neutron [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.433 187212 INFO nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Took 2.78 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.498 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.499 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.539 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.558 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.559 187212 DEBUG nova.compute.provider_tree [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.763 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:03:53 np0005546909 nova_compute[187208]: 2025-12-05 12:03:53.788 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:03:54 np0005546909 nova_compute[187208]: 2025-12-05 12:03:54.839 187212 DEBUG nova.compute.provider_tree [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:54 np0005546909 nova_compute[187208]: 2025-12-05 12:03:54.858 187212 DEBUG nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:54 np0005546909 nova_compute[187208]: 2025-12-05 12:03:54.883 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:54 np0005546909 nova_compute[187208]: 2025-12-05 12:03:54.928 187212 INFO nova.scheduler.client.report [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Deleted allocations for instance d70544d6-04e3-4b2a-914a-72db3052216a#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.015 187212 DEBUG oslo_concurrency.lockutils [None req-1817057c-542a-4147-a028-5c21a3417348 b5f1bf811e6c42d699922035de0b538c 55d3be64e01442ca8f492d2f3e10d1cc - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.051 187212 INFO nova.virt.libvirt.driver [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Snapshot image upload complete#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.051 187212 INFO nova.compute.manager [None req-1cc767c8-9a4f-42b0-b2ae-19986b2ce204 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 5.79 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.133 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.133 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG oslo_concurrency.lockutils [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d70544d6-04e3-4b2a-914a-72db3052216a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] No waiting events found dispatching network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.134 187212 WARNING nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received unexpected event network-vif-plugged-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.135 187212 DEBUG nova.compute.manager [req-2916d5ca-3dc5-4449-bfcb-d1ce24800bac req-f015d392-01c2-4171-a230-cd811d71f4fd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Received event network-vif-deleted-99a1ab7f-bf64-4cc9-846c-9748ff4a93dc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.147 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936220.1460907, 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.147 187212 INFO nova.compute.manager [-] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.177 187212 DEBUG nova.compute.manager [None req-188187e6-77dc-4f1c-8fe6-d6620a329879 - - - - - -] [instance: 05008cd8-8cac-482b-9ff8-68f2f0aaa6d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:55 np0005546909 nova_compute[187208]: 2025-12-05 12:03:55.583 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.227 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.228 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.229 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.230 187212 INFO nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Terminating instance#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.230 187212 DEBUG nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.232 187212 INFO nova.compute.manager [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Pausing#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.232 187212 DEBUG nova.objects.instance [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'flavor' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:56 np0005546909 kernel: tap0a11e563-2b (unregistering): left promiscuous mode
Dec  5 07:03:56 np0005546909 NetworkManager[55691]: <info>  [1764936236.2603] device (tap0a11e563-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.272 187212 DEBUG nova.compute.manager [None req-7a7b1fde-7875-4bb0-a8e8-c57375c20d5a ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.274 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936236.2743678, 30a55909-059f-4a0c-9598-14cc506d42a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.275 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:03:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:56Z|00310|binding|INFO|Releasing lport 0a11e563-2be9-4ce9-af51-7d29b586e233 from this chassis (sb_readonly=0)
Dec  5 07:03:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:56Z|00311|binding|INFO|Setting lport 0a11e563-2be9-4ce9-af51-7d29b586e233 down in Southbound
Dec  5 07:03:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:56Z|00312|binding|INFO|Removing iface tap0a11e563-2b ovn-installed in OVS
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.279 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.284 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:70:f2 10.100.0.12'], port_security=['fa:16:3e:f2:70:f2 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd2085dd9-2ebd-4804-99c1-3b15cbd216f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=0a11e563-2be9-4ce9-af51-7d29b586e233) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.285 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 0a11e563-2be9-4ce9-af51-7d29b586e233 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 unbound from our chassis#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.287 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d064000-316c-46a7-a23c-1dc26318b6a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.289 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da63f076-97d5-42ad-a980-db288b5f6b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.289 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace which is not needed anymore#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.291 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.301 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.307 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:56 np0005546909 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000021.scope: Deactivated successfully.
Dec  5 07:03:56 np0005546909 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d00000021.scope: Consumed 14.127s CPU time.
Dec  5 07:03:56 np0005546909 systemd-machined[153543]: Machine qemu-37-instance-00000021 terminated.
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.331 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Dec  5 07:03:56 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:56 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [NOTICE]   (220123) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:56 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [WARNING]  (220123) : Exiting Master process...
Dec  5 07:03:56 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [ALERT]    (220123) : Current worker (220125) exited with code 143 (Terminated)
Dec  5 07:03:56 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[220119]: [WARNING]  (220123) : All workers exited. Exiting... (0)
Dec  5 07:03:56 np0005546909 systemd[1]: libpod-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope: Deactivated successfully.
Dec  5 07:03:56 np0005546909 podman[221693]: 2025-12-05 12:03:56.429380602 +0000 UTC m=+0.057639538 container died 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:03:56 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:56 np0005546909 systemd[1]: var-lib-containers-storage-overlay-f72b34854b6787c4cd40b26e6f22d36e2f45382e4691a696a9fc490f51c1bb73-merged.mount: Deactivated successfully.
Dec  5 07:03:56 np0005546909 podman[221693]: 2025-12-05 12:03:56.47063025 +0000 UTC m=+0.098889186 container cleanup 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:03:56 np0005546909 systemd[1]: libpod-conmon-592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c.scope: Deactivated successfully.
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.498 187212 INFO nova.virt.libvirt.driver [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Instance destroyed successfully.#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.498 187212 DEBUG nova.objects.instance [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'resources' on Instance uuid d2085dd9-2ebd-4804-99c1-3b15cbd216f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.515 187212 DEBUG nova.virt.libvirt.vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-835443144',display_name='tempest-ImagesOneServerNegativeTestJSON-server-835443144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-835443144',id=33,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-ijey2289',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=d2085dd9-2ebd-4804-99c1-3b15cbd216f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.515 187212 DEBUG nova.network.os_vif_util [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "0a11e563-2be9-4ce9-af51-7d29b586e233", "address": "fa:16:3e:f2:70:f2", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a11e563-2b", "ovs_interfaceid": "0a11e563-2be9-4ce9-af51-7d29b586e233", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.517 187212 DEBUG nova.network.os_vif_util [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.517 187212 DEBUG os_vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.519 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a11e563-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.522 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.528 187212 INFO os_vif [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:70:f2,bridge_name='br-int',has_traffic_filtering=True,id=0a11e563-2be9-4ce9-af51-7d29b586e233,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a11e563-2b')#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.529 187212 INFO nova.virt.libvirt.driver [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deleting instance files /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8_del#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.529 187212 INFO nova.virt.libvirt.driver [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deletion of /var/lib/nova/instances/d2085dd9-2ebd-4804-99c1-3b15cbd216f8_del complete#033[00m
Dec  5 07:03:56 np0005546909 podman[221737]: 2025-12-05 12:03:56.54730073 +0000 UTC m=+0.052039048 container remove 592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.554 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c067939f-cbca-4325-8d52-8616a98d9e93]: (4, ('Fri Dec  5 12:03:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c)\n592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c\nFri Dec  5 12:03:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 (592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c)\n592544acdbdb3faad9ebf5fdb1b308769ee4752833d638e9f39ac041a83f097c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.556 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4318c6-8faa-49e1-8b0a-c9dd9c81f282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.557 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.559 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 kernel: tap5d064000-30: left promiscuous mode
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05ad57f4-88b0-43d9-aa48-f1a46e063d4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.581 187212 INFO nova.compute.manager [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.582 187212 DEBUG oslo.service.loopingcall [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.582 187212 DEBUG nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:56 np0005546909 nova_compute[187208]: 2025-12-05 12:03:56.583 187212 DEBUG nova.network.neutron [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.595 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fb9df3-7feb-4d57-8396-617cc9364865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.598 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff18bb6-e111-4193-80e2-861143be0326]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.614 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7caceb30-bef4-4611-b152-c5182ab541f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 357459, 'reachable_time': 17903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221757, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.617 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:56.617 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbf1065-3a9e-4209-a7bd-40200c316540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:56 np0005546909 systemd[1]: run-netns-ovnmeta\x2d5d064000\x2d316c\x2d46a7\x2da23c\x2d1dc26318b6a4.mount: Deactivated successfully.
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.560 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.623 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.624 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.624 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.625 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.625 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.626 187212 INFO nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Terminating instance#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.627 187212 DEBUG nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:57 np0005546909 kernel: tap909107ba-c9 (unregistering): left promiscuous mode
Dec  5 07:03:57 np0005546909 NetworkManager[55691]: <info>  [1764936237.6467] device (tap909107ba-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.652 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:57Z|00313|binding|INFO|Releasing lport 909107ba-c90a-4004-a47f-e5367cab8f82 from this chassis (sb_readonly=0)
Dec  5 07:03:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:57Z|00314|binding|INFO|Setting lport 909107ba-c90a-4004-a47f-e5367cab8f82 down in Southbound
Dec  5 07:03:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:57Z|00315|binding|INFO|Removing iface tap909107ba-c9 ovn-installed in OVS
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.655 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.660 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:16:8e:a6 10.100.0.12'], port_security=['fa:16:3e:16:8e:a6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7df02f69-ecc9-424d-82ab-dc8ba279ffd5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=909107ba-c90a-4004-a47f-e5367cab8f82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.661 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 909107ba-c90a-4004-a47f-e5367cab8f82 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.663 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.678 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a167f52a-33a3-448d-b939-aa21f4ee2f14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.704 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e460f771-7073-47e5-b169-e9b5f135ad0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.709 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dec63153-74c0-4a5d-95b0-b23dd94a8845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000026.scope: Deactivated successfully.
Dec  5 07:03:57 np0005546909 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000026.scope: Consumed 7.758s CPU time.
Dec  5 07:03:57 np0005546909 systemd-machined[153543]: Machine qemu-43-instance-00000026 terminated.
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.738 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d33766db-a46f-43cf-8bdd-bd8b2c900093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ae0facf9-d60b-4bec-84fb-a70f86778f83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb8ea1ed6-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:fb:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360757, 'reachable_time': 17404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221769, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.775 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[28b9a832-3d4f-46db-82a7-a1a8a6b1d70b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360767, 'tstamp': 360767}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221770, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb8ea1ed6-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360771, 'tstamp': 360771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221770, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.777 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.800 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb8ea1ed6-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.800 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.801 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb8ea1ed6-90, col_values=(('external_ids', {'iface-id': '6f012c31-72e4-4df5-be68-787aa910fb9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:57.801 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.883 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.884 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-unplugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.885 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG oslo_concurrency.lockutils [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.886 187212 DEBUG nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] No waiting events found dispatching network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.887 187212 WARNING nova.compute.manager [req-fbb2a6a5-bd53-40e7-b550-5d3416e13853 req-94ae45d5-131c-490d-893f-956dd8b4e5a3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received unexpected event network-vif-plugged-0a11e563-2be9-4ce9-af51-7d29b586e233 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.911 187212 INFO nova.virt.libvirt.driver [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Instance destroyed successfully.#033[00m
Dec  5 07:03:57 np0005546909 nova_compute[187208]: 2025-12-05 12:03:57.912 187212 DEBUG nova.objects.instance [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.284 187212 DEBUG nova.virt.libvirt.vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-1',id=38,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=7df02f69-ecc9-424d-82ab-dc8ba279ffd5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.285 187212 DEBUG nova.network.os_vif_util [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "909107ba-c90a-4004-a47f-e5367cab8f82", "address": "fa:16:3e:16:8e:a6", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap909107ba-c9", "ovs_interfaceid": "909107ba-c90a-4004-a47f-e5367cab8f82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.286 187212 DEBUG nova.network.os_vif_util [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.286 187212 DEBUG os_vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.288 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap909107ba-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.290 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.292 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.295 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.298 187212 INFO os_vif [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:16:8e:a6,bridge_name='br-int',has_traffic_filtering=True,id=909107ba-c90a-4004-a47f-e5367cab8f82,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap909107ba-c9')#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.298 187212 INFO nova.virt.libvirt.driver [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deleting instance files /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5_del#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.299 187212 INFO nova.virt.libvirt.driver [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deletion of /var/lib/nova/instances/7df02f69-ecc9-424d-82ab-dc8ba279ffd5_del complete#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.342 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.343 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.343 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.344 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.344 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.345 187212 INFO nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Terminating instance#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.346 187212 DEBUG nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 INFO nova.compute.manager [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 DEBUG oslo.service.loopingcall [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.353 187212 DEBUG nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.354 187212 DEBUG nova.network.neutron [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:58 np0005546909 kernel: tap4f7ea95e-e5 (unregistering): left promiscuous mode
Dec  5 07:03:58 np0005546909 NetworkManager[55691]: <info>  [1764936238.3718] device (tap4f7ea95e-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.382 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00316|binding|INFO|Releasing lport 4f7ea95e-e59f-4941-83b6-5c482617a975 from this chassis (sb_readonly=0)
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00317|binding|INFO|Setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 down in Southbound
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00318|binding|INFO|Removing iface tap4f7ea95e-e5 ovn-installed in OVS
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.390 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.391 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.392 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1153941c-4190-450d-9f5b-a6c2b86c03d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.393 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 namespace which is not needed anymore#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.397 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000027.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000027.scope: Consumed 7.189s CPU time.
Dec  5 07:03:58 np0005546909 systemd-machined[153543]: Machine qemu-42-instance-00000027 terminated.
Dec  5 07:03:58 np0005546909 podman[221793]: 2025-12-05 12:03:58.476089273 +0000 UTC m=+0.061173208 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [NOTICE]   (221443) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : Exiting Master process...
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : Exiting Master process...
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [ALERT]    (221443) : Current worker (221445) exited with code 143 (Terminated)
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36[221439]: [WARNING]  (221443) : All workers exited. Exiting... (0)
Dec  5 07:03:58 np0005546909 systemd[1]: libpod-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 podman[221835]: 2025-12-05 12:03:58.536444177 +0000 UTC m=+0.050351559 container died 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:03:58 np0005546909 kernel: tap4f7ea95e-e5: entered promiscuous mode
Dec  5 07:03:58 np0005546909 systemd-udevd[221676]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:03:58 np0005546909 NetworkManager[55691]: <info>  [1764936238.5677] manager: (tap4f7ea95e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Dec  5 07:03:58 np0005546909 kernel: tap4f7ea95e-e5 (unregistering): left promiscuous mode
Dec  5 07:03:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay-cea3cddfc7f419ead74ac9c1e8d910318a876e210e0687b74d3349159defed7c-merged.mount: Deactivated successfully.
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00319|binding|INFO|Claiming lport 4f7ea95e-e59f-4941-83b6-5c482617a975 for this chassis.
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00320|binding|INFO|4f7ea95e-e59f-4941-83b6-5c482617a975: Claiming fa:16:3e:4a:7b:36 10.100.0.3
Dec  5 07:03:58 np0005546909 podman[221835]: 2025-12-05 12:03:58.584336515 +0000 UTC m=+0.098243887 container cleanup 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec  5 07:03:58 np0005546909 systemd[1]: libpod-conmon-9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00321|if_status|INFO|Dropped 8 log messages in last 62 seconds (most recently, 62 seconds ago) due to excessive rate
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00322|if_status|INFO|Not setting lport 4f7ea95e-e59f-4941-83b6-5c482617a975 down as sb is readonly
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 podman[221836]: 2025-12-05 12:03:58.625282454 +0000 UTC m=+0.120113991 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.625 187212 INFO nova.virt.libvirt.driver [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Instance destroyed successfully.#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.625 187212 DEBUG nova.objects.instance [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'resources' on Instance uuid 456f1972-6ed7-4fc2-b046-fa035704d434 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00323|binding|INFO|Releasing lport 4f7ea95e-e59f-4941-83b6-5c482617a975 from this chassis (sb_readonly=0)
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.657 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:58 np0005546909 podman[221894]: 2025-12-05 12:03:58.659139512 +0000 UTC m=+0.051673667 container remove 9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.664 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:7b:36 10.100.0.3'], port_security=['fa:16:3e:4a:7b:36 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '456f1972-6ed7-4fc2-b046-fa035704d434', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bebbbd9623064681bb9350747fba600e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c41eb71-88d6-42e6-a215-1f895bfe2743', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=847c65d3-b784-4ffe-b1f3-a8b606806b3c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=4f7ea95e-e59f-4941-83b6-5c482617a975) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.665 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d60773fd-f54d-419e-8aae-5f6cf367f618]: (4, ('Fri Dec  5 12:03:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1)\n9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1\nFri Dec  5 12:03:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 (9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1)\n9e7777dfb9ed460011f24e3efb1ad827eb3cab0240c9edbdf3d7ce30b92560f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7e63a4be-a65f-46c6-b627-1e3d642e3307]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.667 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb8ea1ed6-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.668 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.676 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.676 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.677 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.678 187212 INFO nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Terminating instance#033[00m
Dec  5 07:03:58 np0005546909 kernel: tapb8ea1ed6-90: left promiscuous mode
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.679 187212 DEBUG nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.679 187212 DEBUG nova.virt.libvirt.vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-2007104146',display_name='tempest-tempest.common.compute-instance-2007104146-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-2007104146-2',id=39,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bebbbd9623064681bb9350747fba600e',ramdisk_id='',reservation_id='r-slnm8hyg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-1941206426',owner_user_name='tempest-MultipleCreateTestJSON-1941206426-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:52Z,user_data=None,user_id='40620135b1ff4f8d9d80eb79f51fd593',uuid=456f1972-6ed7-4fc2-b046-fa035704d434,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG nova.network.os_vif_util [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converting VIF {"id": "4f7ea95e-e59f-4941-83b6-5c482617a975", "address": "fa:16:3e:4a:7b:36", "network": {"id": "b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-246625249-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bebbbd9623064681bb9350747fba600e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f7ea95e-e5", "ovs_interfaceid": "4f7ea95e-e59f-4941-83b6-5c482617a975", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG nova.network.os_vif_util [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.680 187212 DEBUG os_vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.682 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.682 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f7ea95e-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.688 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[261ec7eb-7478-45fc-8cf9-b1b5e5dca27f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.690 187212 INFO os_vif [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:7b:36,bridge_name='br-int',has_traffic_filtering=True,id=4f7ea95e-e59f-4941-83b6-5c482617a975,network=Network(b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f7ea95e-e5')#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.690 187212 INFO nova.virt.libvirt.driver [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deleting instance files /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434_del#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.691 187212 INFO nova.virt.libvirt.driver [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deletion of /var/lib/nova/instances/456f1972-6ed7-4fc2-b046-fa035704d434_del complete#033[00m
Dec  5 07:03:58 np0005546909 kernel: tap9dc35efb-0a (unregistering): left promiscuous mode
Dec  5 07:03:58 np0005546909 NetworkManager[55691]: <info>  [1764936238.7023] device (tap9dc35efb-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.704 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[818f7e74-d8ae-438b-bcb6-76413c14d9d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.705 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[521872f7-516a-4960-b048-97410301b28d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00324|binding|INFO|Releasing lport 9dc35efb-0aed-463b-860e-3b60dd65b6db from this chassis (sb_readonly=0)
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00325|binding|INFO|Setting lport 9dc35efb-0aed-463b-860e-3b60dd65b6db down in Southbound
Dec  5 07:03:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:03:58Z|00326|binding|INFO|Removing iface tap9dc35efb-0a ovn-installed in OVS
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.712 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.716 187212 DEBUG nova.network.neutron [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.720 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc05912f-1ed9-4845-9175-ab9bc3ebd64e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360751, 'reachable_time': 37106, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221922, 'error': None, 'target': 'ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: run-netns-ovnmeta\x2db8ea1ed6\x2d9eec\x2d4cb3\x2da2b6\x2d6146b7b65c36.mount: Deactivated successfully.
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.723 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:04:08 10.100.0.4'], port_security=['fa:16:3e:4b:04:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '30a55909-059f-4a0c-9598-14cc506d42a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9dc35efb-0aed-463b-860e-3b60dd65b6db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.726 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.726 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[89274fc9-3de1-4ddd-8ae5-115954886ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.727 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.728 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.729 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[57ffca8b-0180-4e98-b46f-2e5b15e9091f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.730 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 4f7ea95e-e59f-4941-83b6-5c482617a975 in datapath b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36 unbound from our chassis#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.732 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b8ea1ed6-9eec-4cb3-a2b6-6146b7b65c36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.732 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6885eb8d-2a35-42fb-a1c6-0a0515399090]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.733 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9dc35efb-0aed-463b-860e-3b60dd65b6db in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 unbound from our chassis#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.734 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.735 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[079cabab-c962-47d5-895f-eb4343444743]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:58.735 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace which is not needed anymore#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000025.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.753 187212 INFO nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Took 2.17 seconds to deallocate network for instance.#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000025.scope: Consumed 4.769s CPU time.
Dec  5 07:03:58 np0005546909 systemd-machined[153543]: Machine qemu-41-instance-00000025 terminated.
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 INFO nova.compute.manager [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 DEBUG oslo.service.loopingcall [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.762 187212 DEBUG nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.763 187212 DEBUG nova.network.neutron [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG oslo_concurrency.lockutils [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.768 187212 DEBUG nova.compute.manager [req-82550334-e062-422a-922a-df173f5b6ec7 req-d1f857f1-a6c3-4983-b4a5-c64efa48196e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-unplugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : haproxy version is 2.8.14-c23fe91
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [NOTICE]   (221287) : path to executable is /usr/sbin/haproxy
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : Exiting Master process...
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : Exiting Master process...
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [ALERT]    (221287) : Current worker (221289) exited with code 143 (Terminated)
Dec  5 07:03:58 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[221283]: [WARNING]  (221287) : All workers exited. Exiting... (0)
Dec  5 07:03:58 np0005546909 systemd[1]: libpod-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 podman[221943]: 2025-12-05 12:03:58.870588721 +0000 UTC m=+0.048597859 container died a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:03:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:03:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay-aba16f9d308e86ed666c24a9528a5e9f58db6e3fb9b48c984c73f0766f63478d-merged.mount: Deactivated successfully.
Dec  5 07:03:58 np0005546909 NetworkManager[55691]: <info>  [1764936238.9099] manager: (tap9dc35efb-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/134)
Dec  5 07:03:58 np0005546909 podman[221943]: 2025-12-05 12:03:58.917073199 +0000 UTC m=+0.095082347 container cleanup a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:58 np0005546909 systemd[1]: libpod-conmon-a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b.scope: Deactivated successfully.
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.977 187212 DEBUG nova.compute.provider_tree [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.983 187212 INFO nova.virt.libvirt.driver [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Instance destroyed successfully.#033[00m
Dec  5 07:03:58 np0005546909 nova_compute[187208]: 2025-12-05 12:03:58.984 187212 DEBUG nova.objects.instance [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 30a55909-059f-4a0c-9598-14cc506d42a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:03:59 np0005546909 podman[221985]: 2025-12-05 12:03:59.005228157 +0000 UTC m=+0.045404088 container remove a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.009 187212 DEBUG nova.virt.libvirt.vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1329126976',display_name='tempest-DeleteServersTestJSON-server-1329126976',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1329126976',id=37,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-4bgg3k4a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:56Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=30a55909-059f-4a0c-9598-14cc506d42a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.009 187212 DEBUG nova.network.os_vif_util [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "address": "fa:16:3e:4b:04:08", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9dc35efb-0a", "ovs_interfaceid": "9dc35efb-0aed-463b-860e-3b60dd65b6db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.010 187212 DEBUG nova.network.os_vif_util [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.011 187212 DEBUG os_vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.011 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0f3038f1-2edc-47e0-9004-a8b6ffe8e291]: (4, ('Fri Dec  5 12:03:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b)\na17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b\nFri Dec  5 12:03:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 (a17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b)\na17da6e8d709cf41a61d3d63eeb335fed9f762311b6a18a6efa7e21e7d3b848b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.013 187212 DEBUG nova.scheduler.client.report [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.013 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed75716b-4cbd-4fc3-ac8f-8ee88dcd5e45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.014 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:59 np0005546909 kernel: tapd7360f84-b0: left promiscuous mode
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.017 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9dc35efb-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.037 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.037 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f73898d2-ea14-436d-a708-470e78749a64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.039 187212 INFO os_vif [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:04:08,bridge_name='br-int',has_traffic_filtering=True,id=9dc35efb-0aed-463b-860e-3b60dd65b6db,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9dc35efb-0a')#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.040 187212 INFO nova.virt.libvirt.driver [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deleting instance files /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2_del#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.041 187212 INFO nova.virt.libvirt.driver [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deletion of /var/lib/nova/instances/30a55909-059f-4a0c-9598-14cc506d42a2_del complete#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.045 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.051 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936224.049811, c1e2f189-1777-4f28-97ab-72cf0f60fbc0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.051 187212 INFO nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.052 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46cadb41-3009-4fd9-8fd6-b88aa4e15537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[56e5fa8a-0e57-4e96-8a38-caf4100aeeea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.072 187212 DEBUG nova.compute.manager [None req-9f45b4e0-f64a-4d0f-88b2-6c1b1eb5fd2c - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.076 187212 DEBUG nova.compute.manager [None req-9f45b4e0-f64a-4d0f-88b2-6c1b1eb5fd2c - - - - - -] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: suspended, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.078 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[37eb8854-c0d3-4dca-a361-767f0be1d15c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360573, 'reachable_time': 16088, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222012, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.078 187212 INFO nova.scheduler.client.report [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Deleted allocations for instance d2085dd9-2ebd-4804-99c1-3b15cbd216f8#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.080 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:03:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:03:59.080 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[347ad698-250e-41d2-97c5-10c8d43ab7f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.107 187212 INFO nova.compute.manager [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.107 187212 DEBUG oslo.service.loopingcall [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.109 187212 DEBUG nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.109 187212 DEBUG nova.network.neutron [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:03:59 np0005546909 nova_compute[187208]: 2025-12-05 12:03:59.191 187212 DEBUG oslo_concurrency.lockutils [None req-45100e2e-2562-49cc-ab2b-f6389fe433c8 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "d2085dd9-2ebd-4804-99c1-3b15cbd216f8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:03:59 np0005546909 systemd[1]: run-netns-ovnmeta\x2dd7360f84\x2dbcd5\x2d4e64\x2dbf43\x2d1fdbd8215a70.mount: Deactivated successfully.
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.063 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.065 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.066 187212 INFO nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Terminating instance#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.067 187212 DEBUG nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.074 187212 INFO nova.virt.libvirt.driver [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Instance destroyed successfully.#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.074 187212 DEBUG nova.objects.instance [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid c1e2f189-1777-4f28-97ab-72cf0f60fbc0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.087 187212 DEBUG nova.virt.libvirt.vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:03:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1049650520',display_name='tempest-ImagesTestJSON-server-1049650520',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-1049650520',id=36,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:03:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-kquxoeat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:03:55Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=c1e2f189-1777-4f28-97ab-72cf0f60fbc0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.087 187212 DEBUG nova.network.os_vif_util [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "address": "fa:16:3e:57:88:7f", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapecec1a41-6f", "ovs_interfaceid": "ecec1a41-6f3e-4852-8cdb-9d461eded987", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.088 187212 DEBUG nova.network.os_vif_util [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.088 187212 DEBUG os_vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.090 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapecec1a41-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.096 187212 INFO os_vif [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:57:88:7f,bridge_name='br-int',has_traffic_filtering=True,id=ecec1a41-6f3e-4852-8cdb-9d461eded987,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapecec1a41-6f')#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.097 187212 INFO nova.virt.libvirt.driver [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deleting instance files /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0_del#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.098 187212 INFO nova.virt.libvirt.driver [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deletion of /var/lib/nova/instances/c1e2f189-1777-4f28-97ab-72cf0f60fbc0_del complete#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Received event network-vif-deleted-0a11e563-2be9-4ce9-af51-7d29b586e233 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.137 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG oslo_concurrency.lockutils [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.138 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.139 187212 DEBUG nova.compute.manager [req-a8933418-410d-45b6-b00f-5913ad9b37e1 req-f5f103e2-c776-4983-bab3-9df1fc01e77c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-unplugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.178 187212 INFO nova.compute.manager [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 0.11 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.179 187212 DEBUG oslo.service.loopingcall [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.179 187212 DEBUG nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.180 187212 DEBUG nova.network.neutron [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.891 187212 DEBUG nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.891 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG oslo_concurrency.lockutils [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.892 187212 DEBUG nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] No waiting events found dispatching network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.893 187212 WARNING nova.compute.manager [req-89d5384b-756b-44c0-a605-fad8d61254e9 req-1745ab2b-c133-4d6c-9a35-27833e868beb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received unexpected event network-vif-plugged-909107ba-c90a-4004-a47f-e5367cab8f82 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.979 187212 DEBUG nova.network.neutron [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:00 np0005546909 nova_compute[187208]: 2025-12-05 12:04:00.998 187212 INFO nova.compute.manager [-] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Took 0.82 seconds to deallocate network for instance.
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.064 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.065 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.094 187212 DEBUG nova.network.neutron [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.123 187212 INFO nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Took 2.77 seconds to deallocate network for instance.
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.188 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.192 187212 DEBUG nova.compute.provider_tree [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.212 187212 DEBUG nova.scheduler.client.report [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.235 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.237 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.258 187212 INFO nova.scheduler.client.report [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance c1e2f189-1777-4f28-97ab-72cf0f60fbc0
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.322 187212 DEBUG oslo_concurrency.lockutils [None req-d1da7d75-0ee0-4ca3-8c78-ca2763ec3fe6 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c1e2f189-1777-4f28-97ab-72cf0f60fbc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.344 187212 DEBUG nova.compute.provider_tree [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.362 187212 DEBUG nova.scheduler.client.report [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.382 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.409 187212 INFO nova.scheduler.client.report [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 7df02f69-ecc9-424d-82ab-dc8ba279ffd5
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.457 187212 DEBUG nova.network.neutron [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.487 187212 DEBUG oslo_concurrency.lockutils [None req-7b157888-e1d1-4c6a-b3f3-679e7fc1a113 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "7df02f69-ecc9-424d-82ab-dc8ba279ffd5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.489 187212 INFO nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Took 2.73 seconds to deallocate network for instance.
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.537 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.537 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.604 187212 DEBUG nova.compute.provider_tree [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.618 187212 DEBUG nova.scheduler.client.report [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.638 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.664 187212 INFO nova.scheduler.client.report [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Deleted allocations for instance 456f1972-6ed7-4fc2-b046-fa035704d434
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.730 187212 DEBUG oslo_concurrency.lockutils [None req-22af5c76-8eb3-48de-b957-08d3f155e3d6 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.812 187212 DEBUG nova.network.neutron [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.832 187212 INFO nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Took 2.72 seconds to deallocate network for instance.
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.884 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.884 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.941 187212 DEBUG nova.compute.provider_tree [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.956 187212 DEBUG nova.scheduler.client.report [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:04:01 np0005546909 nova_compute[187208]: 2025-12-05 12:04:01.977 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.020 187212 INFO nova.scheduler.client.report [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 30a55909-059f-4a0c-9598-14cc506d42a2
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.076 187212 DEBUG oslo_concurrency.lockutils [None req-5c7359ec-3bb8-43ef-bd39-06f37bf8b3db ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.400s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:02 np0005546909 podman[222013]: 2025-12-05 12:04:02.224045196 +0000 UTC m=+0.070430243 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.290 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.291 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "456f1972-6ed7-4fc2-b046-fa035704d434-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] No waiting events found dispatching network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received unexpected event network-vif-plugged-4f7ea95e-e59f-4941-83b6-5c482617a975 for instance with vm_state deleted and task_state None.
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.292 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.293 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-unplugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state deleted and task_state None.
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Received event network-vif-deleted-909107ba-c90a-4004-a47f-e5367cab8f82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Received event network-vif-deleted-4f7ea95e-e59f-4941-83b6-5c482617a975 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.294 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG oslo_concurrency.lockutils [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "30a55909-059f-4a0c-9598-14cc506d42a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] No waiting events found dispatching network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 WARNING nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received unexpected event network-vif-plugged-9dc35efb-0aed-463b-860e-3b60dd65b6db for instance with vm_state deleted and task_state None.
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.295 187212 DEBUG nova.compute.manager [req-a7ef210c-019b-415f-9552-a7ca820b3c22 req-3fc5315c-86e3-4986-b655-e5f3851f9ea2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Received event network-vif-deleted-9dc35efb-0aed-463b-860e-3b60dd65b6db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:02 np0005546909 nova_compute[187208]: 2025-12-05 12:04:02.562 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:04:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:03 np0005546909 nova_compute[187208]: 2025-12-05 12:04:03.217 187212 DEBUG nova.compute.manager [req-215b72bb-4ffe-4c6d-a0a3-d3e74120ce59 req-e42ff978-79bf-4066-b40e-cadb44e32c1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c1e2f189-1777-4f28-97ab-72cf0f60fbc0] Received event network-vif-deleted-ecec1a41-6f3e-4852-8cdb-9d461eded987 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.890 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.890 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.912 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.971 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.972 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.978 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:04:04 np0005546909 nova_compute[187208]: 2025-12-05 12:04:04.978 187212 INFO nova.compute.claims [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.087 187212 DEBUG nova.compute.provider_tree [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.105 187212 DEBUG nova.scheduler.client.report [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.125 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.126 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.174 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.174 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.196 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.216 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.289 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.290 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.290 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating image(s)#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.291 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.302 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.398 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.399 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.423 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.484 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936230.4838316, d70544d6-04e3-4b2a-914a-72db3052216a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.486 187212 INFO nova.compute.manager [-] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.490 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.491 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.578 187212 DEBUG nova.compute.manager [None req-97c0c5d6-0a43-4dbe-a0cf-cec4a6ad6c1e - - - - - -] [instance: d70544d6-04e3-4b2a-914a-72db3052216a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.919 187212 DEBUG nova.policy [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.963 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk 1073741824" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.964 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:05 np0005546909 nova_compute[187208]: 2025-12-05 12:04:05.965 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.027 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.029 187212 DEBUG nova.virt.disk.api [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.029 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.103 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.104 187212 DEBUG nova.virt.disk.api [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.104 187212 DEBUG nova.objects.instance [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.121 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Ensure instance console log exists: /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.122 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.123 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.510 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.510 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.532 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.600 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.601 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.608 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.608 187212 INFO nova.compute.claims [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.719 187212 DEBUG nova.compute.provider_tree [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.734 187212 DEBUG nova.scheduler.client.report [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.769 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.770 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.839 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.840 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.881 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:04:06 np0005546909 nova_compute[187208]: 2025-12-05 12:04:06.908 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.008 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.009 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating image(s)#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.010 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.011 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.026 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.068 187212 DEBUG nova.policy [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff425b7b04144f93a2c15e3a347fc15c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4671f6c82ea049fab3a314ecf45b7656', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.087 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.088 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.088 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.101 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.165 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.166 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.201 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.202 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.202 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.265 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.266 187212 DEBUG nova.virt.disk.api [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Checking if we can resize image /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.267 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.338 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.339 187212 DEBUG nova.virt.disk.api [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Cannot resize image /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.339 187212 DEBUG nova.objects.instance [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'migration_context' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.354 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.355 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Ensure instance console log exists: /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.355 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.356 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.356 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.564 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:07 np0005546909 nova_compute[187208]: 2025-12-05 12:04:07.674 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Successfully created port: 656f63d2-77f9-46f7-9338-81bc5a056ad4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:04:08 np0005546909 nova_compute[187208]: 2025-12-05 12:04:08.557 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Successfully created port: ea8794b1-8d29-4839-af08-e1675802ea0a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.422 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "082d2145-1505-4170-9a11-4e46bf86fed2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.422 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "082d2145-1505-4170-9a11-4e46bf86fed2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.454 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.480 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.480 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "58c3288f-57bf-4c62-8d69-9842a22e43d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.492 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Successfully updated port: 656f63d2-77f9-46f7-9338-81bc5a056ad4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquired lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.516 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.518 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.537 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.537 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.553 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.553 187212 INFO nova.compute.claims [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.637 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.805 187212 DEBUG nova.compute.provider_tree [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.823 187212 DEBUG nova.scheduler.client.report [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.842 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.843 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.845 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.852 187212 DEBUG nova.virt.hardware [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.852 187212 INFO nova.compute.claims [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.878 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.934 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.935 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.967 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:04:09 np0005546909 nova_compute[187208]: 2025-12-05 12:04:09.984 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.060 187212 DEBUG nova.compute.provider_tree [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.078 187212 DEBUG nova.scheduler.client.report [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.084 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.085 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.085 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Creating image(s)#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.086 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.086 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.087 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.105 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.105 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.108 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.174 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.175 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.196 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.219 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.220 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:04:10 np0005546909 podman[222065]: 2025-12-05 12:04:10.236112817 +0000 UTC m=+0.088588222 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.245 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.258 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.259 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.278 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.293 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.294 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.295 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.355 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.356 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.357 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.390 187212 DEBUG nova.compute.manager [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.391 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.392 187212 INFO nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Creating image(s)#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.393 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.393 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.394 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "/var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.407 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.428 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.429 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.429 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 082d2145-1505-4170-9a11-4e46bf86fed2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.453 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.454 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Ensure instance console log exists: /var/lib/nova/instances/082d2145-1505-4170-9a11-4e46bf86fed2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.454 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.455 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.455 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.470 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.470 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.471 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.484 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.540 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.541 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.582 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.583 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.584 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.654 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.655 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Checking if we can resize image /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.656 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.724 187212 DEBUG oslo_concurrency.processutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.725 187212 DEBUG nova.virt.disk.api [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Cannot resize image /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.725 187212 DEBUG nova.objects.instance [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lazy-loading 'migration_context' on Instance uuid 58c3288f-57bf-4c62-8d69-9842a22e43d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.741 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.742 187212 DEBUG nova.virt.libvirt.driver [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Ensure instance console log exists: /var/lib/nova/instances/58c3288f-57bf-4c62-8d69-9842a22e43d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.742 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.743 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.743 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.842 187212 DEBUG nova.policy [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:04:10 np0005546909 nova_compute[187208]: 2025-12-05 12:04:10.987 187212 DEBUG nova.policy [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40620135b1ff4f8d9d80eb79f51fd593', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bebbbd9623064681bb9350747fba600e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.122 187212 DEBUG nova.network.neutron [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updating instance_info_cache with network_info: [{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.146 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Releasing lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.147 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance network_info: |[{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.150 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Start _get_guest_xml network_info=[{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.156 187212 WARNING nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.163 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.164 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.168 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.169 187212 DEBUG nova.virt.libvirt.host [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.170 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.170 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.171 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.172 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.172 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.173 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.174 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.174 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.175 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.175 187212 DEBUG nova.virt.hardware [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.183 187212 DEBUG nova.virt.libvirt.vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-64428055',display_name='tempest-DeleteServersTestJSON-server-64428055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-64428055',id=41,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-rpqd5t4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:06Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=c2e63727-b45b-4249-a94f-85b0d6314ba0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.183 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.185 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.186 187212 DEBUG nova.objects.instance [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'pci_devices' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.231 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <uuid>c2e63727-b45b-4249-a94f-85b0d6314ba0</uuid>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <name>instance-00000029</name>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:name>tempest-DeleteServersTestJSON-server-64428055</nova:name>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:04:11</nova:creationTime>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:user uuid="ff425b7b04144f93a2c15e3a347fc15c">tempest-DeleteServersTestJSON-554028480-project-member</nova:user>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:project uuid="4671f6c82ea049fab3a314ecf45b7656">tempest-DeleteServersTestJSON-554028480</nova:project>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        <nova:port uuid="656f63d2-77f9-46f7-9338-81bc5a056ad4">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="serial">c2e63727-b45b-4249-a94f-85b0d6314ba0</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="uuid">c2e63727-b45b-4249-a94f-85b0d6314ba0</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:64:8d:59"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <target dev="tap656f63d2-77"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/console.log" append="off"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:04:11 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:04:11 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:04:11 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:04:11 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.233 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Preparing to wait for external event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.234 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.235 187212 DEBUG nova.virt.libvirt.vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-64428055',display_name='tempest-DeleteServersTestJSON-server-64428055',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-64428055',id=41,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-rpqd5t4m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:06Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=c2e63727-b45b-4249-a94f-85b0d6314ba0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.235 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.236 187212 DEBUG nova.network.os_vif_util [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.236 187212 DEBUG os_vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.237 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.238 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.241 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.241 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap656f63d2-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.242 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap656f63d2-77, col_values=(('external_ids', {'iface-id': '656f63d2-77f9-46f7-9338-81bc5a056ad4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:64:8d:59', 'vm-uuid': 'c2e63727-b45b-4249-a94f-85b0d6314ba0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 NetworkManager[55691]: <info>  [1764936251.2457] manager: (tap656f63d2-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.251 187212 INFO os_vif [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:64:8d:59,bridge_name='br-int',has_traffic_filtering=True,id=656f63d2-77f9-46f7-9338-81bc5a056ad4,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap656f63d2-77')#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.316 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.317 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.317 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] No VIF found with MAC fa:16:3e:64:8d:59, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.318 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Using config drive#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.426 187212 DEBUG nova.compute.manager [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-changed-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.426 187212 DEBUG nova.compute.manager [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Refreshing instance network info cache due to event network-changed-656f63d2-77f9-46f7-9338-81bc5a056ad4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.427 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Refreshing network info cache for port 656f63d2-77f9-46f7-9338-81bc5a056ad4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.497 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936236.4959307, d2085dd9-2ebd-4804-99c1-3b15cbd216f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.498 187212 INFO nova.compute.manager [-] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.523 187212 DEBUG nova.compute.manager [None req-998c5f52-b9c3-4d11-90cc-5fb77ebc3a65 - - - - - -] [instance: d2085dd9-2ebd-4804-99c1-3b15cbd216f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.659 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.659 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.683 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.758 187212 INFO nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Creating config drive at /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.763 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiod01ckn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.805 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.806 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.811 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.812 187212 INFO nova.compute.claims [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.895 187212 DEBUG oslo_concurrency.processutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c2e63727-b45b-4249-a94f-85b0d6314ba0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpiod01ckn" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:11 np0005546909 kernel: tap656f63d2-77: entered promiscuous mode
Dec  5 07:04:11 np0005546909 NetworkManager[55691]: <info>  [1764936251.9793] manager: (tap656f63d2-77): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Dec  5 07:04:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:11Z|00327|binding|INFO|Claiming lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 for this chassis.
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.979 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:11Z|00328|binding|INFO|656f63d2-77f9-46f7-9338-81bc5a056ad4: Claiming fa:16:3e:64:8d:59 10.100.0.4
Dec  5 07:04:11 np0005546909 nova_compute[187208]: 2025-12-05 12:04:11.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:11.996 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:64:8d:59 10.100.0.4'], port_security=['fa:16:3e:64:8d:59 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c2e63727-b45b-4249-a94f-85b0d6314ba0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4671f6c82ea049fab3a314ecf45b7656', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9971cccf-0c8a-4b37-8acd-5568216c48d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b8386cb-64ba-481e-822e-b4855ceb419b, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=656f63d2-77f9-46f7-9338-81bc5a056ad4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:04:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:11.998 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 656f63d2-77f9-46f7-9338-81bc5a056ad4 in datapath d7360f84-bcd5-4e64-bf43-1fdbd8215a70 bound to our chassis#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7360f84-bcd5-4e64-bf43-1fdbd8215a70#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.006 187212 DEBUG nova.compute.provider_tree [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:04:12 np0005546909 systemd-udevd[222137]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.012 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[827d641f-1218-4e0f-b404-40489ceaabe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.013 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7360f84-b1 in ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.015 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7360f84-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.015 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4053045a-d20d-496b-89dc-1ee2a6998965]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.016 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bacca0-4a7f-48dd-95f7-d161724c7952]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.022 187212 DEBUG nova.scheduler.client.report [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:04:12 np0005546909 NetworkManager[55691]: <info>  [1764936252.0261] device (tap656f63d2-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:04:12 np0005546909 NetworkManager[55691]: <info>  [1764936252.0269] device (tap656f63d2-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.027 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ee696fed-b54a-44c0-aad1-6543dd43b8ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.039 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.040 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:04:12 np0005546909 systemd-machined[153543]: New machine qemu-44-instance-00000029.
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.053 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97081fb8-d8a4-4638-8bc8-769d6698ffae]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.054 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:12Z|00329|binding|INFO|Setting lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 ovn-installed in OVS
Dec  5 07:04:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:12Z|00330|binding|INFO|Setting lport 656f63d2-77f9-46f7-9338-81bc5a056ad4 up in Southbound
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:12 np0005546909 systemd[1]: Started Virtual Machine qemu-44-instance-00000029.
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.087 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.087 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.090 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4adc01fc-87ee-4510-b515-476ff7504326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.096 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f41ed1a-8ea6-4828-9c46-ac5c7ee8123d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 NetworkManager[55691]: <info>  [1764936252.0977] manager: (tapd7360f84-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.104 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.125 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.135 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1e09710e-3366-4258-b298-b87110ece5ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.138 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[39643e0e-5076-4a77-9961-0fa6c17ae863]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 NetworkManager[55691]: <info>  [1764936252.1639] device (tapd7360f84-b0): carrier: link connected
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.170 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[20455a17-2bc9-4a55-bd24-760c9017ba9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.190 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b93f67-56b2-465d-b466-03319837492e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363272, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222172, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[db14cc24-2f69-4a6f-8ff7-4a63b2fcebd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:2b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363272, 'tstamp': 363272}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222173, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.225 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.226 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.227 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating image(s)#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.227 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.228 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.228 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.229 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4b5eaf-0dc4-4b00-8305-82ca0236c4c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7360f84-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:2b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363272, 'reachable_time': 36535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222174, 'error': None, 'target': 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.240 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.260 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46d8c3d2-9bb8-4b66-8b76-f7377c978460]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.303 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.304 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.305 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.316 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4571e9-1059-4be1-8571-555340e4ef0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7360f84-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7360f84-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:12 np0005546909 kernel: tapd7360f84-b0: entered promiscuous mode
Dec  5 07:04:12 np0005546909 NetworkManager[55691]: <info>  [1764936252.3330] manager: (tapd7360f84-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7360f84-b0, col_values=(('external_ids', {'iface-id': 'd85bc323-c3ce-47e3-ac1f-d5f27467a4e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:12Z|00331|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.337 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.345 187212 DEBUG nova.policy [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ee170bdfdd343189ee1da01bdb80be6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79895287bd1d488c842f6013729a1f81', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.346 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81ce1368-e548-4394-944c-4234f2e569b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.347 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.pid.haproxy
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d7360f84-bcd5-4e64-bf43-1fdbd8215a70
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:04:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:12.348 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'env', 'PROCESS_TAG=haproxy-d7360f84-bcd5-4e64-bf43-1fdbd8215a70', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7360f84-bcd5-4e64-bf43-1fdbd8215a70.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.349 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.392 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.393 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.418 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936252.4066186, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.419 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Started (Lifecycle Event)#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.437 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.438 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.439 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.466 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.473 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936252.4067945, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.474 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.491 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.496 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.507 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.508 187212 DEBUG nova.virt.disk.api [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Checking if we can resize image /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.508 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.531 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.568 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.575 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.577 187212 DEBUG nova.virt.disk.api [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Cannot resize image /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.577 187212 DEBUG nova.objects.instance [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'migration_context' on Instance uuid a7616662-639b-4642-b507-614773f4748f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.591 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.592 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Ensure instance console log exists: /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.593 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.593 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.594 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:12 np0005546909 podman[222228]: 2025-12-05 12:04:12.717815053 +0000 UTC m=+0.023616216 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:04:12 np0005546909 podman[222228]: 2025-12-05 12:04:12.819766194 +0000 UTC m=+0.125567337 container create 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:04:12 np0005546909 systemd[1]: Started libpod-conmon-720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3.scope.
Dec  5 07:04:12 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:04:12 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1e3820f746877bfac04dbdf959f91cdf0863ffd037e2db606c73901d85cc260/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:04:12 np0005546909 podman[222228]: 2025-12-05 12:04:12.911437064 +0000 UTC m=+0.217238227 container init 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.910 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936237.9096906, 7df02f69-ecc9-424d-82ab-dc8ba279ffd5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.913 187212 INFO nova.compute.manager [-] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:04:12 np0005546909 podman[222228]: 2025-12-05 12:04:12.918008951 +0000 UTC m=+0.223810094 container start 720d92f48abaca25ac8c7b3dfa774e4fecd8a0007a3e26faf0c6c5bcc6d160b3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 07:04:12 np0005546909 nova_compute[187208]: 2025-12-05 12:04:12.937 187212 DEBUG nova.compute.manager [None req-72c91a9b-f3cc-4d63-b202-414439fa12b2 - - - - - -] [instance: 7df02f69-ecc9-424d-82ab-dc8ba279ffd5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:12 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : New worker (222249) forked
Dec  5 07:04:12 np0005546909 neutron-haproxy-ovnmeta-d7360f84-bcd5-4e64-bf43-1fdbd8215a70[222243]: [NOTICE]   (222247) : Loading success.
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.114 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Successfully updated port: ea8794b1-8d29-4839-af08-e1675802ea0a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.150 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.151 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.152 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.624 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936238.6229408, 456f1972-6ed7-4fc2-b046-fa035704d434 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.625 187212 INFO nova.compute.manager [-] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.667 187212 DEBUG nova.compute.manager [None req-c087fbc8-c64a-49ba-8fc4-a32eb330d7aa - - - - - -] [instance: 456f1972-6ed7-4fc2-b046-fa035704d434] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.699 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Successfully created port: 539a9707-ef82-4c64-aec4-3759222680f0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.793 187212 DEBUG nova.compute.manager [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Received event network-changed-ea8794b1-8d29-4839-af08-e1675802ea0a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.793 187212 DEBUG nova.compute.manager [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Refreshing instance network info cache due to event network-changed-ea8794b1-8d29-4839-af08-e1675802ea0a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.794 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.981 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936238.9804034, 30a55909-059f-4a0c-9598-14cc506d42a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.983 187212 INFO nova.compute.manager [-] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:04:13 np0005546909 nova_compute[187208]: 2025-12-05 12:04:13.986 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.018 187212 DEBUG nova.compute.manager [None req-ef65769c-7c86-41df-b01b-81729892b7d7 - - - - - -] [instance: 30a55909-059f-4a0c-9598-14cc506d42a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.230 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.231 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Processing event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.232 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG oslo_concurrency.lockutils [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 DEBUG nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] No waiting events found dispatching network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.233 187212 WARNING nova.compute.manager [req-50927977-ef36-48ba-ba8b-aac2d73558cd req-efec49e4-315a-4bd4-bb89-dc9d1ba14134 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Received unexpected event network-vif-plugged-656f63d2-77f9-46f7-9338-81bc5a056ad4 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.234 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.239 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936254.238292, c2e63727-b45b-4249-a94f-85b0d6314ba0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.239 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.240 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.251 187212 INFO nova.virt.libvirt.driver [-] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Instance spawned successfully.#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.251 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.272 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.279 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.280 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.281 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.281 187212 DEBUG nova.virt.libvirt.driver [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.294 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.348 187212 INFO nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 7.34 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.349 187212 DEBUG nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.420 187212 INFO nova.compute.manager [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Took 7.84 seconds to build instance.#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.424 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Successfully created port: d7b765ff-93e1-4594-9e3c-e177dee2e07b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.436 187212 DEBUG oslo_concurrency.lockutils [None req-23f48977-6dd3-4d66-a767-2aa650c914d5 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.926s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.619 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Successfully created port: eabadaa6-16c4-434c-83ea-96dfa62d7f79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.682 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Successfully updated port: 539a9707-ef82-4c64-aec4-3759222680f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.712 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.712 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquired lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:14 np0005546909 nova_compute[187208]: 2025-12-05 12:04:14.713 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:04:15 np0005546909 nova_compute[187208]: 2025-12-05 12:04:15.198 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updated VIF entry in instance network info cache for port 656f63d2-77f9-46f7-9338-81bc5a056ad4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:04:15 np0005546909 nova_compute[187208]: 2025-12-05 12:04:15.198 187212 DEBUG nova.network.neutron [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Updating instance_info_cache with network_info: [{"id": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "address": "fa:16:3e:64:8d:59", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap656f63d2-77", "ovs_interfaceid": "656f63d2-77f9-46f7-9338-81bc5a056ad4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:15 np0005546909 nova_compute[187208]: 2025-12-05 12:04:15.215 187212 DEBUG oslo_concurrency.lockutils [req-3e9dd577-e696-4716-90f5-8aeb86c69701 req-20db1f6c-2de6-4a07-a471-ff8ec99f3c7b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c2e63727-b45b-4249-a94f-85b0d6314ba0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:15 np0005546909 nova_compute[187208]: 2025-12-05 12:04:15.336 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.173 187212 DEBUG nova.network.neutron [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updating instance_info_cache with network_info: [{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.195 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Releasing lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.196 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Instance network_info: |[{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.200 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Start _get_guest_xml network_info=[{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.208 187212 WARNING nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.217 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.218 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.223 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.224 187212 DEBUG nova.virt.libvirt.host [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.225 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.225 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.226 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.226 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.227 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.227 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.228 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.228 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.229 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.230 187212 DEBUG nova.virt.hardware [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.236 187212 DEBUG nova.virt.libvirt.vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1115906111',id=44,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-jo4hn4lg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:12Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=a7616662-639b-4642-b507-614773f4748f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.236 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.238 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.240 187212 DEBUG nova.objects.instance [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7616662-639b-4642-b507-614773f4748f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.293 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <uuid>a7616662-639b-4642-b507-614773f4748f</uuid>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <name>instance-0000002c</name>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesOneServerNegativeTestJSON-server-1115906111</nova:name>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:04:16</nova:creationTime>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:user uuid="3ee170bdfdd343189ee1da01bdb80be6">tempest-ImagesOneServerNegativeTestJSON-661137252-project-member</nova:user>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:project uuid="79895287bd1d488c842f6013729a1f81">tempest-ImagesOneServerNegativeTestJSON-661137252</nova:project>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        <nova:port uuid="539a9707-ef82-4c64-aec4-3759222680f0">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="serial">a7616662-639b-4642-b507-614773f4748f</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="uuid">a7616662-639b-4642-b507-614773f4748f</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:d2:c8:06"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <target dev="tap539a9707-ef"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/console.log" append="off"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:04:16 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:04:16 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:04:16 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:04:16 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG nova.compute.manager [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Preparing to wait for external event network-vif-plugged-539a9707-ef82-4c64-aec4-3759222680f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "a7616662-639b-4642-b507-614773f4748f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.295 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.296 187212 DEBUG oslo_concurrency.lockutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "a7616662-639b-4642-b507-614773f4748f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.296 187212 DEBUG nova.virt.libvirt.vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',display_name='tempest-ImagesOneServerNegativeTestJSON-server-1115906111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagesoneservernegativetestjson-server-1115906111',id=44,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79895287bd1d488c842f6013729a1f81',ramdisk_id='',reservation_id='r-jo4hn4lg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerNegativeTestJSON-661137
252',owner_user_name='tempest-ImagesOneServerNegativeTestJSON-661137252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:12Z,user_data=None,user_id='3ee170bdfdd343189ee1da01bdb80be6',uuid=a7616662-639b-4642-b507-614773f4748f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.297 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converting VIF {"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.298 187212 DEBUG nova.network.os_vif_util [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.298 187212 DEBUG os_vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.299 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.299 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.300 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap539a9707-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.304 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap539a9707-ef, col_values=(('external_ids', {'iface-id': '539a9707-ef82-4c64-aec4-3759222680f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:c8:06', 'vm-uuid': 'a7616662-639b-4642-b507-614773f4748f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.306 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:16 np0005546909 NetworkManager[55691]: <info>  [1764936256.3087] manager: (tap539a9707-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.320 187212 INFO os_vif [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:c8:06,bridge_name='br-int',has_traffic_filtering=True,id=539a9707-ef82-4c64-aec4-3759222680f0,network=Network(5d064000-316c-46a7-a23c-1dc26318b6a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap539a9707-ef')#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.391 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.393 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.393 187212 DEBUG nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] No VIF found with MAC fa:16:3e:d2:c8:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.394 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Using config drive#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.662 187212 INFO nova.virt.libvirt.driver [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Creating config drive at /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.668 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpicvz7qwy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.699 187212 DEBUG nova.network.neutron [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updating instance_info_cache with network_info: [{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.729 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.729 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Instance network_info: |[{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.730 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.731 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Refreshing network info cache for port ea8794b1-8d29-4839-af08-e1675802ea0a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.734 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Start _get_guest_xml network_info=[{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:04:16 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.740 187212 WARNING nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:16.744 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.027 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.031 187212 DEBUG oslo_concurrency.processutils [None req-6b28b14c-0977-4367-bcb4-81426c512a4e 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7616662-639b-4642-b507-614773f4748f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpicvz7qwy" returned: 0 in 0.362s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.032 187212 DEBUG nova.compute.manager [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Received event network-changed-539a9707-ef82-4c64-aec4-3759222680f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG nova.compute.manager [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Refreshing instance network info cache due to event network-changed-539a9707-ef82-4c64-aec4-3759222680f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.033 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Refreshing network info cache for port 539a9707-ef82-4c64-aec4-3759222680f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.053 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.libvirt.host [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.054 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.055 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.056 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.057 187212 DEBUG nova.virt.hardware [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.060 187212 DEBUG nova.virt.libvirt.vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-404632133',display_name='tempest-ImagesTestJSON-server-404632133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-404632133',id=40,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-d69nje92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=e5212ff3-c6ed-4f02-99c4-becad0e5f2a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.060 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.061 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.062 187212 DEBUG nova.objects.instance [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.075 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <uuid>e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</uuid>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <name>instance-00000028</name>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesTestJSON-server-404632133</nova:name>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:04:16</nova:creationTime>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        <nova:port uuid="ea8794b1-8d29-4839-af08-e1675802ea0a">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="serial">e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="uuid">e5212ff3-c6ed-4f02-99c4-becad0e5f2a5</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:58:21:a9"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <target dev="tapea8794b1-8d"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/console.log" append="off"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:04:17 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:04:17 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:04:17 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:04:17 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.076 187212 DEBUG nova.compute.manager [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Preparing to wait for external event network-vif-plugged-ea8794b1-8d29-4839-af08-e1675802ea0a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.077 187212 DEBUG oslo_concurrency.lockutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "e5212ff3-c6ed-4f02-99c4-becad0e5f2a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.virt.libvirt.vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:04:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-404632133',display_name='tempest-ImagesTestJSON-server-404632133',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-404632133',id=40,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-d69nje92',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:04:05Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=e5212ff3-c6ed-4f02-99c4-becad0e5f2a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.078 187212 DEBUG nova.network.os_vif_util [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG os_vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.079 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.080 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.082 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.083 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea8794b1-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.083 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea8794b1-8d, col_values=(('external_ids', {'iface-id': 'ea8794b1-8d29-4839-af08-e1675802ea0a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:21:a9', 'vm-uuid': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.0862] manager: (tapea8794b1-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.096 187212 INFO os_vif [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:21:a9,bridge_name='br-int',has_traffic_filtering=True,id=ea8794b1-8d29-4839-af08-e1675802ea0a,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea8794b1-8d')#033[00m
Dec  5 07:04:17 np0005546909 kernel: tap539a9707-ef: entered promiscuous mode
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.1284] manager: (tap539a9707-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00332|binding|INFO|Claiming lport 539a9707-ef82-4c64-aec4-3759222680f0 for this chassis.
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00333|binding|INFO|539a9707-ef82-4c64-aec4-3759222680f0: Claiming fa:16:3e:d2:c8:06 10.100.0.4
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.139 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.148 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:c8:06 10.100.0.4'], port_security=['fa:16:3e:d2:c8:06 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7616662-639b-4642-b507-614773f4748f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d064000-316c-46a7-a23c-1dc26318b6a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79895287bd1d488c842f6013729a1f81', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1ec2415-6840-4cf9-b5ac-efaf1a9c9a58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3804b014-203a-4c47-b0bb-7634579c4ec4, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=539a9707-ef82-4c64-aec4-3759222680f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.149 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 539a9707-ef82-4c64-aec4-3759222680f0 in datapath 5d064000-316c-46a7-a23c-1dc26318b6a4 bound to our chassis#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.151 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d064000-316c-46a7-a23c-1dc26318b6a4#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.154 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.155 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.155 187212 DEBUG nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:58:21:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.156 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Using config drive#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.165 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1045f622-c6e5-4686-8e90-4ca1b10a6a45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.166 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d064000-31 in ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:04:17 np0005546909 systemd-machined[153543]: New machine qemu-45-instance-0000002c.
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.169 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d064000-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.169 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13f822b2-1798-4604-97f8-68b575def2b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.171 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f9429717-693f-4baf-8373-94f9464c46e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.186 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8819262e-8cd9-402a-ad33-33294532ed67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 systemd[1]: Started Virtual Machine qemu-45-instance-0000002c.
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00334|binding|INFO|Releasing lport d85bc323-c3ce-47e3-ac1f-d5f27467a4e9 from this chassis (sb_readonly=0)
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.214 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[803881da-3756-455a-97d6-7e94dc46761b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00335|binding|INFO|Setting lport 539a9707-ef82-4c64-aec4-3759222680f0 ovn-installed in OVS
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00336|binding|INFO|Setting lport 539a9707-ef82-4c64-aec4-3759222680f0 up in Southbound
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 systemd-udevd[222287]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.2454] device (tap539a9707-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.2472] device (tap539a9707-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.259 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0fcf6a-d5d8-4344-a3c2-6a106e53fa4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4cdce7-d314-4f7e-825c-6b0a35dd9a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.2683] manager: (tap5d064000-30): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.303 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[caa868d5-a762-42a6-ab02-46622d9447ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.307 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3d09cb8e-71bb-43e1-961d-d26e6e9c18d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.3343] device (tap5d064000-30): carrier: link connected
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.339 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[57d6974c-24f3-489b-ad9a-9bc73ccf709a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.379 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[843fd080-89d6-4c83-8b19-d28c6f276f8a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363789, 'reachable_time': 18329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222336, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 podman[222293]: 2025-12-05 12:04:17.389575062 +0000 UTC m=+0.112654979 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.403 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ee0d5421-3665-4f41-b79b-57db63c20782]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:6d24'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363789, 'tstamp': 363789}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222340, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.420 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e005e595-87b3-48b5-ba9c-c79674090791]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d064000-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:6d:24'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363789, 'reachable_time': 18329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222341, 'error': None, 'target': 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[411ace59-c02e-4476-8ad2-d5725fb2a839]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.527 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b90da461-ea69-40d3-9a64-63e042ece49f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.528 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d064000-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.529 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.529 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d064000-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 NetworkManager[55691]: <info>  [1764936257.5319] manager: (tap5d064000-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Dec  5 07:04:17 np0005546909 kernel: tap5d064000-30: entered promiscuous mode
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.537 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d064000-30, col_values=(('external_ids', {'iface-id': '1b49f23e-d835-4ef5-82b9-a339d97fd4cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:17Z|00337|binding|INFO|Releasing lport 1b49f23e-d835-4ef5-82b9-a339d97fd4cd from this chassis (sb_readonly=0)
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.554 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.555 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05cdc05b-8e25-40f6-814c-0194ff809ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.556 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-5d064000-316c-46a7-a23c-1dc26318b6a4
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/5d064000-316c-46a7-a23c-1dc26318b6a4.pid.haproxy
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 5d064000-316c-46a7-a23c-1dc26318b6a4
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:04:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:17.556 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4', 'env', 'PROCESS_TAG=haproxy-5d064000-316c-46a7-a23c-1dc26318b6a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d064000-316c-46a7-a23c-1dc26318b6a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.843 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936257.8427467, a7616662-639b-4642-b507-614773f4748f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.843 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] VM Started (Lifecycle Event)#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.862 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.868 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936257.8429296, a7616662-639b-4642-b507-614773f4748f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.868 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.890 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.893 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.923 187212 DEBUG oslo_concurrency.lockutils [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.923 187212 DEBUG oslo_concurrency.lockutils [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "c2e63727-b45b-4249-a94f-85b0d6314ba0" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.924 187212 DEBUG nova.compute.manager [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.925 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: a7616662-639b-4642-b507-614773f4748f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.928 187212 DEBUG nova.compute.manager [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.929 187212 DEBUG nova.objects.instance [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'flavor' on Instance uuid c2e63727-b45b-4249-a94f-85b0d6314ba0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:17 np0005546909 nova_compute[187208]: 2025-12-05 12:04:17.954 187212 DEBUG nova.virt.libvirt.driver [None req-54fbd1f3-0494-4b7e-a1b6-c66695e6eb88 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: c2e63727-b45b-4249-a94f-85b0d6314ba0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:04:17 np0005546909 podman[222383]: 2025-12-05 12:04:17.960629723 +0000 UTC m=+0.054991612 container create 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:04:18 np0005546909 systemd[1]: Started libpod-conmon-4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc.scope.
Dec  5 07:04:18 np0005546909 podman[222383]: 2025-12-05 12:04:17.93218175 +0000 UTC m=+0.026543659 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:04:18 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:04:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95fa845c032e9a9b1ecabac8ab36f6e4863c974067db1dae8ab549ec5cbc0437/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:04:18 np0005546909 podman[222383]: 2025-12-05 12:04:18.052703683 +0000 UTC m=+0.147065602 container init 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:04:18 np0005546909 podman[222383]: 2025-12-05 12:04:18.058011994 +0000 UTC m=+0.152373883 container start 4a41576caaa6b4e3476760b77598217b6d4370ebe86c3fc2759adc35f3eceddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:04:18 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : New worker (222404) forked
Dec  5 07:04:18 np0005546909 neutron-haproxy-ovnmeta-5d064000-316c-46a7-a23c-1dc26318b6a4[222398]: [NOTICE]   (222402) : Loading success.
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.087 187212 INFO nova.virt.libvirt.driver [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Creating config drive at /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.094 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3smp6vbu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.227 187212 DEBUG oslo_concurrency.processutils [None req-82aab9b9-9bf1-4e30-86c8-ef1c18c57ec0 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5212ff3-c6ed-4f02-99c4-becad0e5f2a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3smp6vbu" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.3162] manager: (tapea8794b1-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Dec  5 07:04:18 np0005546909 systemd-udevd[222315]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:04:18 np0005546909 kernel: tapea8794b1-8d: entered promiscuous mode
Dec  5 07:04:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:18Z|00338|binding|INFO|Claiming lport ea8794b1-8d29-4839-af08-e1675802ea0a for this chassis.
Dec  5 07:04:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:18Z|00339|binding|INFO|ea8794b1-8d29-4839-af08-e1675802ea0a: Claiming fa:16:3e:58:21:a9 10.100.0.3
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.325 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updated VIF entry in instance network info cache for port 539a9707-ef82-4c64-aec4-3759222680f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.326 187212 DEBUG nova.network.neutron [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: a7616662-639b-4642-b507-614773f4748f] Updating instance_info_cache with network_info: [{"id": "539a9707-ef82-4c64-aec4-3759222680f0", "address": "fa:16:3e:d2:c8:06", "network": {"id": "5d064000-316c-46a7-a23c-1dc26318b6a4", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-983632549-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79895287bd1d488c842f6013729a1f81", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap539a9707-ef", "ovs_interfaceid": "539a9707-ef82-4c64-aec4-3759222680f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.338 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:21:a9 10.100.0.3'], port_security=['fa:16:3e:58:21:a9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e5212ff3-c6ed-4f02-99c4-becad0e5f2a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ea8794b1-8d29-4839-af08-e1675802ea0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.339 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ea8794b1-8d29-4839-af08-e1675802ea0a in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.3517] device (tapea8794b1-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.352 187212 DEBUG oslo_concurrency.lockutils [req-968450c8-ae2d-4dd7-a9e2-1985a31f3846 req-fdfd1e91-8b59-4387-8688-47d4621a852c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-a7616662-639b-4642-b507-614773f4748f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.3528] device (tapea8794b1-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.353 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78b903e9-677f-45ed-9c70-4eea929bae0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.355 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.357 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.357 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3a0a84ab-915e-472f-b546-aa2334176199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3ddf8e-2d06-4a26-8d85-06e0862e4443]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.370 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[07afddd6-74b9-45c4-83bf-f3ccbb12afc6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 systemd-machined[153543]: New machine qemu-46-instance-00000028.
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:18Z|00340|binding|INFO|Setting lport ea8794b1-8d29-4839-af08-e1675802ea0a ovn-installed in OVS
Dec  5 07:04:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:18Z|00341|binding|INFO|Setting lport ea8794b1-8d29-4839-af08-e1675802ea0a up in Southbound
Dec  5 07:04:18 np0005546909 systemd[1]: Started Virtual Machine qemu-46-instance-00000028.
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.391 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.395 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[492af5ad-a1f4-41ae-95ef-895f916a1f9d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.424 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f80a38-baa4-419a-9024-dcdc03449369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.4415] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.438 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca439e8-46a0-4767-a22a-a6a4750c3e76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.482 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[199fb589-1e01-4848-8a3d-117e3a6d53b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.486 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa200c9-f563-4f7f-9f4b-0c17355f15fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.5147] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.521 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[02f46216-d081-4f39-b282-4c0066a0bf0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.538 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dde2a3-ead2-4c10-bf95-218496486dfe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222448, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.552 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d222c94-9013-47f6-857c-7d361542acb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 363907, 'tstamp': 363907}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222449, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87bb2706-19be-4844-8276-47c15252cffe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 95], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 363907, 'reachable_time': 21906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222450, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.610 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[194482ee-da4e-4c75-9a82-4aa3a5136f59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.685 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf57426-e8ac-4f5c-a7d5-28ae7e9c26ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.687 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.687 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.688 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 NetworkManager[55691]: <info>  [1764936258.6908] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Dec  5 07:04:18 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.692 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:04:18Z|00342|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.706 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.709 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.710 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d667d97-356b-4262-83cb-f85e9cf5fd7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.711 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:04:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:04:18.711 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.880 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936258.8794317, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.880 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.903 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.907 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936258.8799453, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.907 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.925 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.929 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:04:18 np0005546909 nova_compute[187208]: 2025-12-05 12:04:18.947 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:04:19 np0005546909 podman[222489]: 2025-12-05 12:04:19.110138817 +0000 UTC m=+0.062321171 container create c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:04:19 np0005546909 systemd[1]: Started libpod-conmon-c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589.scope.
Dec  5 07:04:19 np0005546909 podman[222489]: 2025-12-05 12:04:19.07280381 +0000 UTC m=+0.024986154 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:04:19 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:04:19 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b7c90b66f75d352f77a65a6e2c60491d5ded526586a774fd4cd500b1acf38ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:04:19 np0005546909 podman[222489]: 2025-12-05 12:04:19.200466087 +0000 UTC m=+0.152648431 container init c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:04:19 np0005546909 podman[222489]: 2025-12-05 12:04:19.205705967 +0000 UTC m=+0.157888291 container start c4f96acfcec305d90bd6dddb022f55297d16a6985148293718709e6194dd7589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  5 07:04:19 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : New worker (222511) forked
Dec  5 07:04:19 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[222505]: [NOTICE]   (222509) : Loading success.
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.153 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Successfully updated port: d7b765ff-93e1-4594-9e3c-e177dee2e07b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.172 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.172 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.173 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.178 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Successfully updated port: eabadaa6-16c4-434c-83ea-96dfa62d7f79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.195 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquiring lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.195 187212 DEBUG oslo_concurrency.lockutils [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] Acquired lock "refresh_cache-58c3288f-57bf-4c62-8d69-9842a22e43d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.196 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.440 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.441 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.464 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.572 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.573 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.579 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.579 187212 INFO nova.compute.claims [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.623 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 58c3288f-57bf-4c62-8d69-9842a22e43d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.627 187212 DEBUG nova.network.neutron [None req-2996eae4-a288-474a-b3eb-f4499d6e75b9 40620135b1ff4f8d9d80eb79f51fd593 bebbbd9623064681bb9350747fba600e - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.794 187212 DEBUG nova.compute.provider_tree [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.810 187212 DEBUG nova.scheduler.client.report [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.832 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.832 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.876 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.876 187212 DEBUG nova.network.neutron [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.892 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:04:20 np0005546909 nova_compute[187208]: 2025-12-05 12:04:20.907 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.009 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating image(s)#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.010 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.011 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.011 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.024 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.082 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.084 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.085 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.095 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.157 187212 DEBUG nova.compute.manager [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Received event network-changed-d7b765ff-93e1-4594-9e3c-e177dee2e07b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.158 187212 DEBUG nova.compute.manager [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 082d2145-1505-4170-9a11-4e46bf86fed2] Refreshing instance network info cache due to event network-changed-d7b765ff-93e1-4594-9e3c-e177dee2e07b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.158 187212 DEBUG oslo_concurrency.lockutils [req-5ee59a98-8943-4979-97cd-7a3b4135d857 req-80dda0bd-19ed-41c4-bc66-45bb42269bbf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-082d2145-1505-4170-9a11-4e46bf86fed2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.160 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.160 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.210 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.211 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.212 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.231 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updated VIF entry in instance network info cache for port ea8794b1-8d29-4839-af08-e1675802ea0a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.232 187212 DEBUG nova.network.neutron [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Updating instance_info_cache with network_info: [{"id": "ea8794b1-8d29-4839-af08-e1675802ea0a", "address": "fa:16:3e:58:21:a9", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea8794b1-8d", "ovs_interfaceid": "ea8794b1-8d29-4839-af08-e1675802ea0a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.249 187212 DEBUG oslo_concurrency.lockutils [req-1f0a413b-aac4-49b4-a470-4500d7839360 req-8c1440f5-d7b0-4daa-b9ea-25340a6d76f9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e5212ff3-c6ed-4f02-99c4-becad0e5f2a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.268 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.269 187212 DEBUG nova.virt.disk.api [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Checking if we can resize image /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.269 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.287 187212 DEBUG nova.network.neutron [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.287 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.324 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.324 187212 DEBUG nova.virt.disk.api [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Cannot resize image /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.325 187212 DEBUG nova.objects.instance [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'migration_context' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.336 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.337 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Ensure instance console log exists: /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.337 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.338 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.338 187212 DEBUG oslo_concurrency.lockutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.339 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.345 187212 WARNING nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.350 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.351 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.354 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.354 187212 DEBUG nova.virt.libvirt.host [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.355 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.356 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.357 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.358 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.358 187212 DEBUG nova.virt.hardware [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.362 187212 DEBUG nova.objects.instance [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'pci_devices' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.375 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <uuid>004672c5-70e2-4940-bc9c-8971d94cc037</uuid>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <name>instance-0000002d</name>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:name>tempest-ListImageFiltersTestJSON-server-469388429</nova:name>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:04:21</nova:creationTime>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:user uuid="8456efa356654e5c990efa4aef688e8a">tempest-ListImageFiltersTestJSON-277323355-project-member</nova:user>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:        <nova:project uuid="42d9566206cb469ebd803d0600019533">tempest-ListImageFiltersTestJSON-277323355</nova:project>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="serial">004672c5-70e2-4940-bc9c-8971d94cc037</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="uuid">004672c5-70e2-4940-bc9c-8971d94cc037</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/console.log" append="off"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:04:21 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:04:21 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:04:21 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:04:21 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.429 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.431 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.432 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Using config drive#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.656 187212 INFO nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Creating config drive at /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.661 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsr4fpwc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:04:21 np0005546909 nova_compute[187208]: 2025-12-05 12:04:21.791 187212 DEBUG oslo_concurrency.processutils [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsr4fpwc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:04:21 np0005546909 systemd-machined[153543]: New machine qemu-47-instance-0000002d.
Dec  5 07:04:21 np0005546909 systemd[1]: Started Virtual Machine qemu-47-instance-0000002d.
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.212 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936262.2122447, 004672c5-70e2-4940-bc9c-8971d94cc037 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.213 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.217 187212 DEBUG nova.compute.manager [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.217 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.222 187212 INFO nova.virt.libvirt.driver [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance spawned successfully.#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.222 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.246 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.253 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.257 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.258 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.258 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:04:22 np0005546909 nova_compute[187208]: 2025-12-05 12:04:22.259 187212 DEBUG nova.virt.libvirt.driver [None req-8e8ee723-cc3d-449f-af45-521a90c512e7 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.011 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.034 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.034 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.041 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.041 187212 INFO nova.compute.claims [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.119 187212 DEBUG nova.network.neutron [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.136 187212 INFO nova.compute.manager [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Took 0.84 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.191 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.259 187212 DEBUG nova.compute.provider_tree [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.276 187212 DEBUG nova.scheduler.client.report [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:03 np0005546909 rsyslogd[1004]: imjournal: 5109 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.304 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.305 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.307 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.352 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.353 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.370 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.387 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.458 187212 DEBUG nova.compute.provider_tree [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.478 187212 DEBUG nova.scheduler.client.report [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.484 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.485 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.485 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating image(s)#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.486 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.486 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.487 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.499 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.502 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.526 187212 INFO nova.scheduler.client.report [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Deleted allocations for instance 00262d23-bf60-44d9-a775-63ba32adaf96#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.569 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.570 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.571 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.582 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.606 187212 DEBUG oslo_concurrency.lockutils [None req-dc72442f-44c1-4247-b24e-4486eb2ef9f0 3ee170bdfdd343189ee1da01bdb80be6 79895287bd1d488c842f6013729a1f81 - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.658 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.659 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.700 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.703 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.703 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.728 187212 DEBUG nova.policy [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a00ac4435e6647779ffaf4a5cde18fdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.767 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.768 187212 DEBUG nova.virt.disk.api [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Checking if we can resize image /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.768 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.790 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.791 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.792 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.794 187212 INFO nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Terminating instance#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.795 187212 DEBUG nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.803 187212 INFO nova.virt.libvirt.driver [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Instance destroyed successfully.#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.804 187212 DEBUG nova.objects.instance [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lazy-loading 'resources' on Instance uuid 21873f07-a1da-4158-a5b2-1d44d547874e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.815 187212 DEBUG nova.virt.libvirt.vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1892322843',display_name='tempest-DeleteServersTestJSON-server-1892322843',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1892322843',id=51,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:04:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4671f6c82ea049fab3a314ecf45b7656',ramdisk_id='',reservation_id='r-javbhuzq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-554028480',owner_user_name='tempest-DeleteServersTestJSON-554028480-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:00Z,user_data=None,user_id='ff425b7b04144f93a2c15e3a347fc15c',uuid=21873f07-a1da-4158-a5b2-1d44d547874e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.816 187212 DEBUG nova.network.os_vif_util [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converting VIF {"id": "c4a66ea2-9b1b-486a-a750-17072882c42e", "address": "fa:16:3e:b7:70:07", "network": {"id": "d7360f84-bcd5-4e64-bf43-1fdbd8215a70", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-437442147-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4671f6c82ea049fab3a314ecf45b7656", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc4a66ea2-9b", "ovs_interfaceid": "c4a66ea2-9b1b-486a-a750-17072882c42e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.817 187212 DEBUG nova.network.os_vif_util [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.817 187212 DEBUG os_vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.819 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4a66ea2-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.826 187212 INFO os_vif [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:70:07,bridge_name='br-int',has_traffic_filtering=True,id=c4a66ea2-9b1b-486a-a750-17072882c42e,network=Network(d7360f84-bcd5-4e64-bf43-1fdbd8215a70),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc4a66ea2-9b')#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.826 187212 INFO nova.virt.libvirt.driver [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deleting instance files /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e_del#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.827 187212 INFO nova.virt.libvirt.driver [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deletion of /var/lib/nova/instances/21873f07-a1da-4158-a5b2-1d44d547874e_del complete#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.830 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.831 187212 DEBUG nova.virt.disk.api [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Cannot resize image /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.831 187212 DEBUG nova.objects.instance [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'migration_context' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.853 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Ensure instance console log exists: /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.854 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 INFO nova.compute.manager [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 0.09 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 DEBUG oslo.service.loopingcall [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.887 187212 DEBUG nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:03 np0005546909 nova_compute[187208]: 2025-12-05 12:05:03.888 187212 DEBUG nova.network.neutron [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.013 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] No waiting events found dispatching network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 WARNING nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received unexpected event network-vif-unplugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.014 187212 DEBUG oslo_concurrency.lockutils [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "00262d23-bf60-44d9-a775-63ba32adaf96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] No waiting events found dispatching network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 WARNING nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received unexpected event network-vif-plugged-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.015 187212 DEBUG nova.compute.manager [req-83d71b75-7602-4e12-89d2-b3c58a31122b req-f389a451-8484-46b9-a7b7-fde7b1c1b846 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Received event network-vif-deleted-ff2850e9-aaf4-4f4e-a323-24b258a0b4c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.188 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.189 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.189 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.190 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.190 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.191 187212 INFO nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Terminating instance#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.191 187212 DEBUG nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:04 np0005546909 kernel: tapb7157ade-85 (unregistering): left promiscuous mode
Dec  5 07:05:04 np0005546909 podman[224550]: 2025-12-05 12:05:04.214266234 +0000 UTC m=+0.062968361 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:05:04 np0005546909 NetworkManager[55691]: <info>  [1764936304.2171] device (tapb7157ade-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:05:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:04Z|00417|binding|INFO|Releasing lport b7157ade-85e0-4802-8d6a-0dfb86921b3a from this chassis (sb_readonly=0)
Dec  5 07:05:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:04Z|00418|binding|INFO|Setting lport b7157ade-85e0-4802-8d6a-0dfb86921b3a down in Southbound
Dec  5 07:05:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:04Z|00419|binding|INFO|Removing iface tapb7157ade-85 ovn-installed in OVS
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.231 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:04:47 10.100.0.5'], port_security=['fa:16:3e:ea:04:47 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c6a957dd-2181-4e92-9e06-e1a15fe5c307', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b7157ade-85e0-4802-8d6a-0dfb86921b3a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.232 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b7157ade-85e0-4802-8d6a-0dfb86921b3a in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.234 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.237 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.251 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4e05d2-f685-4fa7-b10f-b5cb31094e73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000030.scope: Deactivated successfully.
Dec  5 07:05:04 np0005546909 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000030.scope: Consumed 12.997s CPU time.
Dec  5 07:05:04 np0005546909 systemd-machined[153543]: Machine qemu-53-instance-00000030 terminated.
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.284 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf16a78-bfb7-4d11-8158-fbe2d28fc834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[feda4205-7c55-4970-96fc-efa747157ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.316 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bdfd177f-ecb1-45b8-a560-8c373eded33f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.335 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[32c96826-58f9-4ee2-b8f2-ed61684251de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365754, 'reachable_time': 25183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224579, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.354 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a23252d7-efd4-4d94-a0b0-efc49bbea9ff]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365765, 'tstamp': 365765}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224580, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap2dd8ae79-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365769, 'tstamp': 365769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224580, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.356 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.362 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.362 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.363 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:04.363 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.417 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.469 187212 INFO nova.virt.libvirt.driver [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Instance destroyed successfully.#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.470 187212 DEBUG nova.objects.instance [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'resources' on Instance uuid c6a957dd-2181-4e92-9e06-e1a15fe5c307 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.485 187212 DEBUG nova.virt.libvirt.vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-2',id=48,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-12-05T12:04:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:42Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=c6a957dd-2181-4e92-9e06-e1a15fe5c307,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.485 187212 DEBUG nova.network.os_vif_util [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "address": "fa:16:3e:ea:04:47", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7157ade-85", "ovs_interfaceid": "b7157ade-85e0-4802-8d6a-0dfb86921b3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.486 187212 DEBUG nova.network.os_vif_util [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.486 187212 DEBUG os_vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.488 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.488 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7157ade-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.493 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.559 187212 INFO os_vif [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:04:47,bridge_name='br-int',has_traffic_filtering=True,id=b7157ade-85e0-4802-8d6a-0dfb86921b3a,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7157ade-85')#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.560 187212 INFO nova.virt.libvirt.driver [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deleting instance files /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307_del#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.560 187212 INFO nova.virt.libvirt.driver [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deletion of /var/lib/nova/instances/c6a957dd-2181-4e92-9e06-e1a15fe5c307_del complete#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.623 187212 INFO nova.compute.manager [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.623 187212 DEBUG oslo.service.loopingcall [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.624 187212 DEBUG nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:04 np0005546909 nova_compute[187208]: 2025-12-05 12:05:04.624 187212 DEBUG nova.network.neutron [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.078 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Successfully created port: db2c3297-b6c8-4933-9328-102d81d6faa3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.125 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.126 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.126 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.127 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.127 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.128 187212 INFO nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Terminating instance#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.130 187212 DEBUG nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:05 np0005546909 kernel: tap83fb1d43-a4 (unregistering): left promiscuous mode
Dec  5 07:05:05 np0005546909 NetworkManager[55691]: <info>  [1764936305.1531] device (tap83fb1d43-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00420|binding|INFO|Releasing lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 from this chassis (sb_readonly=0)
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00421|binding|INFO|Setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 down in Southbound
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00422|binding|INFO|Removing iface tap83fb1d43-a4 ovn-installed in OVS
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00423|binding|INFO|Releasing lport 96c6c9a6-c871-4fab-9fdc-eedbdd230979 from this chassis (sb_readonly=0)
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.170 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.171 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.173 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bd38aa-6b7a-4055-b3ef-81352fea8033]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace which is not needed anymore#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : haproxy version is 2.8.14-c23fe91
Dec  5 07:05:05 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [NOTICE]   (223483) : path to executable is /usr/sbin/haproxy
Dec  5 07:05:05 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [ALERT]    (223483) : Current worker (223485) exited with code 143 (Terminated)
Dec  5 07:05:05 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[223479]: [WARNING]  (223483) : All workers exited. Exiting... (0)
Dec  5 07:05:05 np0005546909 systemd[1]: libpod-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope: Deactivated successfully.
Dec  5 07:05:05 np0005546909 conmon[223479]: conmon 787283fd7dd42037005d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope/container/memory.events
Dec  5 07:05:05 np0005546909 podman[224620]: 2025-12-05 12:05:05.347812652 +0000 UTC m=+0.072424953 container died 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:05:05 np0005546909 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000031.scope: Deactivated successfully.
Dec  5 07:05:05 np0005546909 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000031.scope: Consumed 13.619s CPU time.
Dec  5 07:05:05 np0005546909 systemd-machined[153543]: Machine qemu-52-instance-00000031 terminated.
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.379 187212 DEBUG nova.network.neutron [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:05 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da-userdata-shm.mount: Deactivated successfully.
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.381 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 systemd[1]: var-lib-containers-storage-overlay-dc61896ea332a9ba7458de04edb53befc42cb9ab21a97cb2cedca198eb888fcf-merged.mount: Deactivated successfully.
Dec  5 07:05:05 np0005546909 podman[224620]: 2025-12-05 12:05:05.404410399 +0000 UTC m=+0.129022680 container cleanup 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:05:05 np0005546909 systemd[1]: libpod-conmon-787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da.scope: Deactivated successfully.
Dec  5 07:05:05 np0005546909 podman[224647]: 2025-12-05 12:05:05.472548698 +0000 UTC m=+0.046938951 container remove 787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.478 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ec34fe-1b73-415e-b5b9-7c0513be2015]: (4, ('Fri Dec  5 12:05:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da)\n787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da\nFri Dec  5 12:05:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da)\n787283fd7dd42037005dee003342400feef1acd805f5cd1c6012d8fe537d03da\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45232b24-036c-4720-b01b-69f61edc9558]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.481 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 kernel: tap2dd8ae79-a0: left promiscuous mode
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.498 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.501 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d0e141c-1c0b-4ea5-8812-0c670ca75a4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.516 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df67ccac-9883-4221-a984-22902db6eda1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.517 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ed39343-9497-489f-b776-cc900d2c12c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f19f61-848b-4db6-af0d-8b2ef63a86f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365741, 'reachable_time': 18090, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224667, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 systemd[1]: run-netns-ovnmeta\x2d2dd8ae79\x2da0f0\x2d469c\x2d86de\x2da9a5d5b69f75.mount: Deactivated successfully.
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.536 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.537 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bd02f1-4c5c-41ea-9b4d-d2f0c618b36c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 kernel: tap83fb1d43-a4: entered promiscuous mode
Dec  5 07:05:05 np0005546909 NetworkManager[55691]: <info>  [1764936305.5499] manager: (tap83fb1d43-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Dec  5 07:05:05 np0005546909 kernel: tap83fb1d43-a4 (unregistering): left promiscuous mode
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00424|binding|INFO|Claiming lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 for this chassis.
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00425|binding|INFO|83fb1d43-a495-47f4-ad3a-569fd7c02c76: Claiming fa:16:3e:41:de:18 10.100.0.14
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.608 187212 INFO nova.virt.libvirt.driver [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Instance destroyed successfully.#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.608 187212 DEBUG nova.objects.instance [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lazy-loading 'resources' on Instance uuid 97020786-7ba5-4c8b-8a2c-838c0f663bb4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00426|if_status|INFO|Dropped 2 log messages in last 12 seconds (most recently, 12 seconds ago) due to excessive rate
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00427|if_status|INFO|Not setting lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 down as sb is readonly
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00428|binding|INFO|Releasing lport 83fb1d43-a495-47f4-ad3a-569fd7c02c76 from this chassis (sb_readonly=0)
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.630 187212 INFO nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Took 1.74 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.631 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.633 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 bound to our chassis#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.635 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.638 187212 DEBUG nova.virt.libvirt.vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:04:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-951078504',display_name='tempest-ListServersNegativeTestJSON-server-951078504-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-951078504-3',id=49,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-12-05T12:04:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31d3d0a57b064ff6abd01727d4443c0b',ramdisk_id='',reservation_id='r-ljsgjd2d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1128597959',owner_user_name='tempest-ListServersNegativeTestJSON-1128597959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:04:42Z,user_data=None,user_id='d51d545246e0434591329e386f100a7d',uuid=97020786-7ba5-4c8b-8a2c-838c0f663bb4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.639 187212 DEBUG nova.network.os_vif_util [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converting VIF {"id": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "address": "fa:16:3e:41:de:18", "network": {"id": "2dd8ae79-a0f0-469c-86de-a9a5d5b69f75", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1543606428-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31d3d0a57b064ff6abd01727d4443c0b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83fb1d43-a4", "ovs_interfaceid": "83fb1d43-a495-47f4-ad3a-569fd7c02c76", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.639 187212 DEBUG nova.network.os_vif_util [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.640 187212 DEBUG os_vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.641 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83fb1d43-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.642 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:de:18 10.100.0.14'], port_security=['fa:16:3e:41:de:18 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '97020786-7ba5-4c8b-8a2c-838c0f663bb4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31d3d0a57b064ff6abd01727d4443c0b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5bd204a2-320e-456d-a83a-6e434dec755e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5edf9cff-e405-4455-acf7-3f2f1a382e6f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=83fb1d43-a495-47f4-ad3a-569fd7c02c76) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.644 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.646 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.647 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90205ae5-2662-42fc-b20b-f300ab68e75f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.648 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2dd8ae79-a1 in ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.648 187212 INFO os_vif [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:de:18,bridge_name='br-int',has_traffic_filtering=True,id=83fb1d43-a495-47f4-ad3a-569fd7c02c76,network=Network(2dd8ae79-a0f0-469c-86de-a9a5d5b69f75),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83fb1d43-a4')#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.648 187212 INFO nova.virt.libvirt.driver [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deleting instance files /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4_del#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.649 187212 INFO nova.virt.libvirt.driver [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deletion of /var/lib/nova/instances/97020786-7ba5-4c8b-8a2c-838c0f663bb4_del complete#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.650 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2dd8ae79-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.650 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8361833-bb3e-4194-b1e9-d407e13f7bd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.651 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[638471e4-fea4-4787-95ab-67363fbba53d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.664 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d5eebad3-d02d-4470-93bf-0c9c3ad8b69a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.690 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72b22825-6350-42a9-af2e-26f5e545642a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.729 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f81e0bc4-8f62-4415-9d76-09d1daff633e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.737 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5f316c24-d920-40b5-804d-f9a81450050a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 NetworkManager[55691]: <info>  [1764936305.7381] manager: (tap2dd8ae79-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Dec  5 07:05:05 np0005546909 systemd-udevd[224611]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.741 187212 INFO nova.compute.manager [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 0.61 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.741 187212 DEBUG oslo.service.loopingcall [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.742 187212 DEBUG nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.742 187212 DEBUG nova.network.neutron [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.769 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6715e4db-1708-465e-9b61-1ef590526be0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.773 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b25083d7-65b4-432c-b804-d3c321eb2d62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 NetworkManager[55691]: <info>  [1764936305.7998] device (tap2dd8ae79-a0): carrier: link connected
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.807 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f56a63f-2003-4542-84fa-21964003bdd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2e72bf-450b-48da-a3c8-d78cd91a81b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368636, 'reachable_time': 36434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224712, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.843 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e17db419-cc4e-4f16-a726-fd31eb54e938]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:dbed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368636, 'tstamp': 368636}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224713, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6ced642e-c5e8-4979-97bb-bf4f397b7532]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2dd8ae79-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:db:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 119], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368636, 'reachable_time': 36434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224714, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.889 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20d79d5e-ac85-4228-b085-c10efc036462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.912 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.912 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.950 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[758e17a1-1ae3-439d-ac19-4162485db29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.952 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2dd8ae79-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 NetworkManager[55691]: <info>  [1764936305.9549] manager: (tap2dd8ae79-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Dec  5 07:05:05 np0005546909 kernel: tap2dd8ae79-a0: entered promiscuous mode
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.958 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2dd8ae79-a0, col_values=(('external_ids', {'iface-id': '96c6c9a6-c871-4fab-9fdc-eedbdd230979'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:05Z|00429|binding|INFO|Releasing lport 96c6c9a6-c871-4fab-9fdc-eedbdd230979 from this chassis (sb_readonly=0)
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.967 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.967 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] No waiting events found dispatching network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-unplugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.968 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG oslo_concurrency.lockutils [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 DEBUG nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] No waiting events found dispatching network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.969 187212 WARNING nova.compute.manager [req-3070b206-ea3f-4861-8f9a-b43cb1bd0d2b req-46eb31f9-cb5e-4391-b62f-596cca8f8306 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received unexpected event network-vif-plugged-b7157ade-85e0-4802-8d6a-0dfb86921b3a for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.972 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.973 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e00b754e-8e98-4ad7-82a4-3c83203a9a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.973 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.pid.haproxy
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:05:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:05.974 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'env', 'PROCESS_TAG=haproxy-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2dd8ae79-a0f0-469c-86de-a9a5d5b69f75.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:05:05 np0005546909 nova_compute[187208]: 2025-12-05 12:05:05.975 187212 DEBUG nova.network.neutron [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.020 187212 INFO nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Took 1.40 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.045 187212 DEBUG nova.compute.provider_tree [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.066 187212 DEBUG nova.scheduler.client.report [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.106 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.107 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.109 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.148 187212 INFO nova.scheduler.client.report [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Deleted allocations for instance 21873f07-a1da-4158-a5b2-1d44d547874e#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.242 187212 DEBUG nova.compute.provider_tree [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.271 187212 DEBUG nova.scheduler.client.report [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.296 187212 DEBUG oslo_concurrency.lockutils [None req-444d627b-26fe-4361-9f5c-7eff83dbdc66 ff425b7b04144f93a2c15e3a347fc15c 4671f6c82ea049fab3a314ecf45b7656 - - default default] Lock "21873f07-a1da-4158-a5b2-1d44d547874e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:06 np0005546909 podman[224745]: 2025-12-05 12:05:06.33599251 +0000 UTC m=+0.049860754 container create b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.372 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:06 np0005546909 systemd[1]: Started libpod-conmon-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope.
Dec  5 07:05:06 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:05:06 np0005546909 podman[224745]: 2025-12-05 12:05:06.310124727 +0000 UTC m=+0.023992991 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:05:06 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dceecec4caf6be812f54c3c703b049ec952977d6772c98d729786d6ad75fed0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.410 187212 INFO nova.scheduler.client.report [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Deleted allocations for instance c6a957dd-2181-4e92-9e06-e1a15fe5c307#033[00m
Dec  5 07:05:06 np0005546909 podman[224745]: 2025-12-05 12:05:06.42399177 +0000 UTC m=+0.137860034 container init b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:05:06 np0005546909 podman[224745]: 2025-12-05 12:05:06.429842898 +0000 UTC m=+0.143711142 container start b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : New worker (224766) forked
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : Loading success.
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.486 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 83fb1d43-a495-47f4-ad3a-569fd7c02c76 in datapath 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 unbound from our chassis#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.488 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.489 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[817cd543-33c9-4262-8ff4-5cee08905a94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.489 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 namespace which is not needed anymore#033[00m
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : haproxy version is 2.8.14-c23fe91
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [NOTICE]   (224764) : path to executable is /usr/sbin/haproxy
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : Exiting Master process...
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : Exiting Master process...
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [ALERT]    (224764) : Current worker (224766) exited with code 143 (Terminated)
Dec  5 07:05:06 np0005546909 neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75[224760]: [WARNING]  (224764) : All workers exited. Exiting... (0)
Dec  5 07:05:06 np0005546909 systemd[1]: libpod-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope: Deactivated successfully.
Dec  5 07:05:06 np0005546909 podman[224791]: 2025-12-05 12:05:06.616289148 +0000 UTC m=+0.042851082 container died b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:05:06 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c-userdata-shm.mount: Deactivated successfully.
Dec  5 07:05:06 np0005546909 systemd[1]: var-lib-containers-storage-overlay-dceecec4caf6be812f54c3c703b049ec952977d6772c98d729786d6ad75fed0e-merged.mount: Deactivated successfully.
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.644 187212 DEBUG oslo_concurrency.lockutils [None req-f259d1a9-7af9-4f2c-9a50-a706043cff10 d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "c6a957dd-2181-4e92-9e06-e1a15fe5c307" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.646 187212 DEBUG nova.network.neutron [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:06 np0005546909 podman[224791]: 2025-12-05 12:05:06.649245536 +0000 UTC m=+0.075807460 container cleanup b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.664 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Successfully updated port: db2c3297-b6c8-4933-9328-102d81d6faa3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:06 np0005546909 systemd[1]: libpod-conmon-b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c.scope: Deactivated successfully.
Dec  5 07:05:06 np0005546909 podman[224821]: 2025-12-05 12:05:06.724048026 +0000 UTC m=+0.054487477 container remove b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7a28d7-ba7b-4e73-9bf4-b9f4236a9663]: (4, ('Fri Dec  5 12:05:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c)\nb8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c\nFri Dec  5 12:05:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 (b8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c)\nb8e3b555aa231bbb760506b479722e1e14523d538155c4765daee171e82f505c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.730 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a78010-feaa-4d40-80f0-0bff5dcd08bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.731 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2dd8ae79-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:06 np0005546909 kernel: tap2dd8ae79-a0: left promiscuous mode
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.745 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.747 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b706710b-e16a-462b-867d-9c18024190b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[611f9dfa-0137-41d5-a522-cf3c59a637c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.764 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[145761e0-a244-4c26-acfd-3368ffb8fe1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.780 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6085efae-e50f-418b-82de-bac89391b6d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368628, 'reachable_time': 29011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224837, 'error': None, 'target': 'ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 systemd[1]: run-netns-ovnmeta\x2d2dd8ae79\x2da0f0\x2d469c\x2d86de\x2da9a5d5b69f75.mount: Deactivated successfully.
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.782 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2dd8ae79-a0f0-469c-86de-a9a5d5b69f75 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:05:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:06.782 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0f90b6-fdcc-490b-89ab-89e6dee24ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.974 187212 INFO nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Took 1.23 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:06 np0005546909 nova_compute[187208]: 2025-12-05 12:05:06.980 187212 DEBUG nova.compute.manager [req-4f0cf994-749d-46ea-9cc9-6c6b66f13b9b req-5197b3f9-2d88-4b36-85dc-43c6e7ad9a26 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Received event network-vif-deleted-c4a66ea2-9b1b-486a-a750-17072882c42e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquired lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.027 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.084 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.085 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.167 187212 DEBUG nova.compute.provider_tree [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.183 187212 DEBUG nova.scheduler.client.report [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.209 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.233 187212 INFO nova.scheduler.client.report [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Deleted allocations for instance 97020786-7ba5-4c8b-8a2c-838c0f663bb4#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.311 187212 DEBUG oslo_concurrency.lockutils [None req-f409204f-bcb0-47c3-b54d-4533585c401d d51d545246e0434591329e386f100a7d 31d3d0a57b064ff6abd01727d4443c0b - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.342 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:07 np0005546909 nova_compute[187208]: 2025-12-05 12:05:07.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.486 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936293.4859166, 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.487 187212 INFO nova.compute.manager [-] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.511 187212 DEBUG nova.compute.manager [None req-d92b68d0-a353-4b2c-9a71-1a2f0f28f0f7 - - - - - -] [instance: 8e6b9036-30c2-4ecf-bd1e-ab6a88cc74c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.513 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Received event network-vif-deleted-b7157ade-85e0-4802-8d6a-0dfb86921b3a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.514 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] No waiting events found dispatching network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 WARNING nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received unexpected event network-vif-unplugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.515 187212 DEBUG oslo_concurrency.lockutils [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "97020786-7ba5-4c8b-8a2c-838c0f663bb4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] No waiting events found dispatching network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 WARNING nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received unexpected event network-vif-plugged-83fb1d43-a495-47f4-ad3a-569fd7c02c76 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:05:08 np0005546909 nova_compute[187208]: 2025-12-05 12:05:08.516 187212 DEBUG nova.compute.manager [req-59f3a4c4-31e0-4a82-80b0-72f2ae04f60d req-a3e0a0eb-14c5-436f-9827-7b017952ffad 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Received event network-vif-deleted-83fb1d43-a495-47f4-ad3a-569fd7c02c76 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG nova.compute.manager [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-changed-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG nova.compute.manager [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Refreshing instance network info cache due to event network-changed-db2c3297-b6c8-4933-9328-102d81d6faa3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.110 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.457 187212 DEBUG nova.network.neutron [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Releasing lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance network_info: |[{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.474 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.475 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Refreshing network info cache for port db2c3297-b6c8-4933-9328-102d81d6faa3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.477 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start _get_guest_xml network_info=[{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.482 187212 WARNING nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.489 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.490 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.501 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.503 187212 DEBUG nova.virt.libvirt.host [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.503 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.504 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.505 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.506 187212 DEBUG nova.virt.hardware [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.509 187212 DEBUG nova.virt.libvirt.vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:03Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.510 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.510 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.511 187212 DEBUG nova.objects.instance [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.528 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <uuid>ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</uuid>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <name>instance-00000035</name>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:name>tempest-ImagesTestJSON-server-465631494</nova:name>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:09</nova:creationTime>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:user uuid="a00ac4435e6647779ffaf4a5cde18fdb">tempest-ImagesTestJSON-276789408-project-member</nova:user>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:project uuid="43e63f5c6b0f4840ad4df23fb5c10764">tempest-ImagesTestJSON-276789408</nova:project>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        <nova:port uuid="db2c3297-b6c8-4933-9328-102d81d6faa3">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="serial">ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="uuid">ed7b6780-872e-41ef-a0c7-c48d0d6d13fd</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:66:5d:24"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <target dev="tapdb2c3297-b6"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/console.log" append="off"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:09 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:09 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:09 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:09 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Preparing to wait for external event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.529 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.530 187212 DEBUG nova.virt.libvirt.vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:03Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.530 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG nova.network.os_vif_util [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG os_vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.532 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.532 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb2c3297-b6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.534 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb2c3297-b6, col_values=(('external_ids', {'iface-id': 'db2c3297-b6c8-4933-9328-102d81d6faa3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:5d:24', 'vm-uuid': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.536 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:09 np0005546909 NetworkManager[55691]: <info>  [1764936309.5372] manager: (tapdb2c3297-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.544 187212 INFO os_vif [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6')#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.621 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] No VIF found with MAC fa:16:3e:66:5d:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:05:09 np0005546909 nova_compute[187208]: 2025-12-05 12:05:09.622 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Using config drive#033[00m
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.031 187212 INFO nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Creating config drive at /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config#033[00m
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.040 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pwfs35a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.185 187212 DEBUG oslo_concurrency.processutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8pwfs35a" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:10 np0005546909 kernel: tapdb2c3297-b6: entered promiscuous mode
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.2767] manager: (tapdb2c3297-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/176)
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:10Z|00430|binding|INFO|Claiming lport db2c3297-b6c8-4933-9328-102d81d6faa3 for this chassis.
Dec  5 07:05:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:10Z|00431|binding|INFO|db2c3297-b6c8-4933-9328-102d81d6faa3: Claiming fa:16:3e:66:5d:24 10.100.0.5
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:10 np0005546909 systemd-udevd[224864]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:10 np0005546909 systemd-machined[153543]: New machine qemu-57-instance-00000035.
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.3294] device (tapdb2c3297-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.3309] device (tapdb2c3297-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:10Z|00432|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 ovn-installed in OVS
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:10 np0005546909 systemd[1]: Started Virtual Machine qemu-57-instance-00000035.
Dec  5 07:05:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:10Z|00433|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 up in Southbound
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.473 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:5d:24 10.100.0.5'], port_security=['fa:16:3e:66:5d:24 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=db2c3297-b6c8-4933-9328-102d81d6faa3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.475 104471 INFO neutron.agent.ovn.metadata.agent [-] Port db2c3297-b6c8-4933-9328-102d81d6faa3 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd bound to our chassis#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.476 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.489 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a12a5d3f-45e5-4252-be33-f33ad9c3c29c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.490 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap41b3b495-c1 in ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.495 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap41b3b495-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.495 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1a3c63-a997-4087-8946-64bda44ea9b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.496 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9d969bfc-46b2-4a31-8e9a-65a1cf853837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.509 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[dc122b3b-c566-4f08-9833-6580cb025b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.527 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e268cdeb-c045-4c1b-b587-3dcca5aa5bcf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.564 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16b2fce8-550f-439f-b504-ff77f1d520c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.5787] manager: (tap41b3b495-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/177)
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.578 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7fea2d0c-8402-43db-9866-fa4622cdb27c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 systemd-udevd[224867]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.620 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e30562fb-e075-442f-b018-934761928dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.623 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1efed765-a75f-4a95-8496-75c499cc4198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.6451] device (tap41b3b495-c0): carrier: link connected
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.651 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f47435-81dd-488f-bdab-2444d0d16bce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e83e9fa8-4d7e-4652-9493-7cf05e928c5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369120, 'reachable_time': 42678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224901, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.692 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0aebad91-6948-414b-8019-7a2e9480f365]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2a:a102'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369120, 'tstamp': 369120}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224902, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.715 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b61cda51-83dd-433c-9a04-a783d847d271]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap41b3b495-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2a:a1:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369120, 'reachable_time': 42678, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224903, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.748 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64271236-41d0-4832-8220-12b92e43e257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.805 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6db54a-2b69-43f6-a408-bda1a706ce93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.808 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b3b495-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:10 np0005546909 kernel: tap41b3b495-c0: entered promiscuous mode
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:10 np0005546909 NetworkManager[55691]: <info>  [1764936310.8129] manager: (tap41b3b495-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.813 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap41b3b495-c0, col_values=(('external_ids', {'iface-id': 'c6869fa0-977a-4f62-90c1-e160e2bd6f9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:10Z|00434|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.816 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.820 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[229d862d-be6f-4cb6-b6a8-82e2be878f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.822 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/41b3b495-c1c9-44c0-b1a3-a499df6548dd.pid.haproxy
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 41b3b495-c1c9-44c0-b1a3-a499df6548dd
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:05:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:10.822 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'env', 'PROCESS_TAG=haproxy-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/41b3b495-c1c9-44c0-b1a3-a499df6548dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:05:10 np0005546909 nova_compute[187208]: 2025-12-05 12:05:10.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.027 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936311.0264604, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.029 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Started (Lifecycle Event)#033[00m
Dec  5 07:05:11 np0005546909 podman[224943]: 2025-12-05 12:05:11.26848067 +0000 UTC m=+0.050141462 container create d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:05:11 np0005546909 podman[224935]: 2025-12-05 12:05:11.269799068 +0000 UTC m=+0.055480596 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:05:11 np0005546909 systemd[1]: Started libpod-conmon-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope.
Dec  5 07:05:11 np0005546909 podman[224943]: 2025-12-05 12:05:11.241489514 +0000 UTC m=+0.023150336 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:05:11 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:05:11 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3400c5b4535476a6d33ef667b47e2b8030edbf230d73171381716b0df7dfcda6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:05:11 np0005546909 podman[224943]: 2025-12-05 12:05:11.379948235 +0000 UTC m=+0.161609117 container init d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:05:11 np0005546909 podman[224943]: 2025-12-05 12:05:11.386219015 +0000 UTC m=+0.167879827 container start d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  5 07:05:11 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : New worker (224988) forked
Dec  5 07:05:11 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : Loading success.
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.469 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936311.0283916, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.501 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.505 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:11 np0005546909 nova_compute[187208]: 2025-12-05 12:05:11.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.594 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG nova.compute.manager [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.830 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG oslo_concurrency.lockutils [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG nova.compute.manager [req-0c840bb8-d919-4eca-b2bc-4edc2193dfe6 req-1daaa134-b6bc-4e23-8a1d-3ba085c6cbd4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Processing event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.831 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.835 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936312.8350568, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.837 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.839 187212 INFO nova.virt.libvirt.driver [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance spawned successfully.#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.839 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.869 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.875 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.878 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.878 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.879 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.880 187212 DEBUG nova.virt.libvirt.driver [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.915 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.962 187212 INFO nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 9.48 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:05:12 np0005546909 nova_compute[187208]: 2025-12-05 12:05:12.962 187212 DEBUG nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:13 np0005546909 nova_compute[187208]: 2025-12-05 12:05:13.053 187212 INFO nova.compute.manager [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 10.04 seconds to build instance.#033[00m
Dec  5 07:05:13 np0005546909 nova_compute[187208]: 2025-12-05 12:05:13.077 187212 DEBUG oslo_concurrency.lockutils [None req-3388101f-f154-4c98-9165-ab8730448f1c a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:13 np0005546909 nova_compute[187208]: 2025-12-05 12:05:13.635 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updated VIF entry in instance network info cache for port db2c3297-b6c8-4933-9328-102d81d6faa3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:13 np0005546909 nova_compute[187208]: 2025-12-05 12:05:13.636 187212 DEBUG nova.network.neutron [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [{"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:13 np0005546909 nova_compute[187208]: 2025-12-05 12:05:13.653 187212 DEBUG oslo_concurrency.lockutils [req-297119aa-4bca-447c-9eff-be25718b61fc req-2cc1c57d-4369-4c22-9bbd-6f12a9b1f964 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:14 np0005546909 nova_compute[187208]: 2025-12-05 12:05:14.275 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936299.274574, e5212ff3-c6ed-4f02-99c4-becad0e5f2a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:14 np0005546909 nova_compute[187208]: 2025-12-05 12:05:14.275 187212 INFO nova.compute.manager [-] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:14 np0005546909 nova_compute[187208]: 2025-12-05 12:05:14.300 187212 DEBUG nova.compute.manager [None req-d092ffa6-6746-410d-8a42-528440b51758 - - - - - -] [instance: e5212ff3-c6ed-4f02-99c4-becad0e5f2a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:14 np0005546909 nova_compute[187208]: 2025-12-05 12:05:14.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:14Z|00435|binding|INFO|Releasing lport c6869fa0-977a-4f62-90c1-e160e2bd6f9f from this chassis (sb_readonly=0)
Dec  5 07:05:14 np0005546909 nova_compute[187208]: 2025-12-05 12:05:14.955 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.002 187212 DEBUG nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.003 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.003 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.004 187212 DEBUG oslo_concurrency.lockutils [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.004 187212 DEBUG nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] No waiting events found dispatching network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.005 187212 WARNING nova.compute.manager [req-5d6204f7-3aaf-4b39-ad47-5bf943c4f735 req-7c55336d-521a-43de-831b-c2f671a966d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received unexpected event network-vif-plugged-db2c3297-b6c8-4933-9328-102d81d6faa3 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.096 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936300.0949879, 21873f07-a1da-4158-a5b2-1d44d547874e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.097 187212 INFO nova.compute.manager [-] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.143 187212 DEBUG nova.compute.manager [None req-7d932572-a523-4c35-bd22-7a9240366515 - - - - - -] [instance: 21873f07-a1da-4158-a5b2-1d44d547874e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.206 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.206 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.207 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.208 187212 INFO nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Terminating instance#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquired lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.209 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.449 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.808 187212 DEBUG nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.851 187212 INFO nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] instance snapshotting#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.897 187212 DEBUG nova.network.neutron [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.913 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Releasing lock "refresh_cache-f50947f2-f8d0-4d6b-bca4-b5412a206503" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:15 np0005546909 nova_compute[187208]: 2025-12-05 12:05:15.913 187212 DEBUG nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:15 np0005546909 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Dec  5 07:05:15 np0005546909 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d0000002e.scope: Consumed 15.377s CPU time.
Dec  5 07:05:15 np0005546909 systemd-machined[153543]: Machine qemu-50-instance-0000002e terminated.
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.082 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Beginning live snapshot process#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.169 187212 INFO nova.virt.libvirt.driver [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance destroyed successfully.#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.170 187212 DEBUG nova.objects.instance [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'resources' on Instance uuid f50947f2-f8d0-4d6b-bca4-b5412a206503 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.215 187212 INFO nova.virt.libvirt.driver [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deleting instance files /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503_del#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.216 187212 INFO nova.virt.libvirt.driver [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deletion of /var/lib/nova/instances/f50947f2-f8d0-4d6b-bca4-b5412a206503_del complete#033[00m
Dec  5 07:05:16 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.400 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.434 187212 INFO nova.compute.manager [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.435 187212 DEBUG oslo.service.loopingcall [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.435 187212 DEBUG nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.436 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.491 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.492 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.548 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd/disk --force-share --output=json -f qcow2" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.563 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.621 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.622 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.655 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.656 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.708 187212 DEBUG nova.virt.libvirt.guest [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.711 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.751 187212 DEBUG nova.privsep.utils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.752 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.960 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.975 187212 DEBUG nova.network.neutron [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.989 187212 DEBUG oslo_concurrency.processutils [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5.delta /var/lib/nova/instances/snapshots/tmpkowsqvyf/db569d43e2da49c186fcade326705bc5" returned: 0 in 0.237s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.990 187212 INFO nova.virt.libvirt.driver [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:05:16 np0005546909 nova_compute[187208]: 2025-12-05 12:05:16.994 187212 INFO nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Took 0.56 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.052 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.052 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.155 187212 DEBUG nova.compute.provider_tree [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.171 187212 DEBUG nova.scheduler.client.report [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.195 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.213 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936302.2112913, 00262d23-bf60-44d9-a775-63ba32adaf96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.214 187212 INFO nova.compute.manager [-] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.233 187212 INFO nova.scheduler.client.report [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Deleted allocations for instance f50947f2-f8d0-4d6b-bca4-b5412a206503#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.238 187212 WARNING nova.compute.manager [None req-de3b2f95-d04d-4ebb-8bcc-ea3ff945639b a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Image not found during snapshot: nova.exception.ImageNotFound: Image f0688666-c4f9-4480-9537-e60553567a2b could not be found.#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.241 187212 DEBUG nova.compute.manager [None req-f3ae35f3-12db-4735-a2ac-a5598e179a02 - - - - - -] [instance: 00262d23-bf60-44d9-a775-63ba32adaf96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.299 187212 DEBUG oslo_concurrency.lockutils [None req-ca95eec1-da62-490c-9fdf-80677beee88f 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "f50947f2-f8d0-4d6b-bca4-b5412a206503" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:17 np0005546909 nova_compute[187208]: 2025-12-05 12:05:17.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.224 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.225 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.225 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.226 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.226 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.227 187212 INFO nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Terminating instance#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.227 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.228 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquired lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.228 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:18 np0005546909 podman[225033]: 2025-12-05 12:05:18.229194878 +0000 UTC m=+0.079376933 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:05:18 np0005546909 nova_compute[187208]: 2025-12-05 12:05:18.891 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.468 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936304.4675956, c6a957dd-2181-4e92-9e06-e1a15fe5c307 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.469 187212 INFO nova.compute.manager [-] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.487 187212 DEBUG nova.compute.manager [None req-4875ea78-5ecf-46e6-b3c7-62f83fdf0a85 - - - - - -] [instance: c6a957dd-2181-4e92-9e06-e1a15fe5c307] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.800 187212 DEBUG nova.network.neutron [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.818 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Releasing lock "refresh_cache-004672c5-70e2-4940-bc9c-8971d94cc037" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.819 187212 DEBUG nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.943 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.944 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.944 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.945 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.945 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.946 187212 INFO nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Terminating instance#033[00m
Dec  5 07:05:19 np0005546909 nova_compute[187208]: 2025-12-05 12:05:19.947 187212 DEBUG nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:05:19 np0005546909 kernel: tapdb2c3297-b6 (unregistering): left promiscuous mode
Dec  5 07:05:19 np0005546909 NetworkManager[55691]: <info>  [1764936319.9722] device (tapdb2c3297-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:05:20 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:20Z|00436|binding|INFO|Releasing lport db2c3297-b6c8-4933-9328-102d81d6faa3 from this chassis (sb_readonly=0)
Dec  5 07:05:20 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:20Z|00437|binding|INFO|Setting lport db2c3297-b6c8-4933-9328-102d81d6faa3 down in Southbound
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:20Z|00438|binding|INFO|Removing iface tapdb2c3297-b6 ovn-installed in OVS
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.025 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:5d:24 10.100.0.5'], port_security=['fa:16:3e:66:5d:24 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ed7b6780-872e-41ef-a0c7-c48d0d6d13fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '43e63f5c6b0f4840ad4df23fb5c10764', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2438e27-7492-4e95-ae11-a6dff631eb7c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7810233-983e-4cb6-8e64-dd7fecfbdcd0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=db2c3297-b6c8-4933-9328-102d81d6faa3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.027 104471 INFO neutron.agent.ovn.metadata.agent [-] Port db2c3297-b6c8-4933-9328-102d81d6faa3 in datapath 41b3b495-c1c9-44c0-b1a3-a499df6548dd unbound from our chassis#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.028 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41b3b495-c1c9-44c0-b1a3-a499df6548dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.028 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.030 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c98e2b3-77c0-4532-9f82-c59f33b712a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.031 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd namespace which is not needed anymore#033[00m
Dec  5 07:05:20 np0005546909 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000002d.scope: Consumed 14.328s CPU time.
Dec  5 07:05:20 np0005546909 systemd-machined[153543]: Machine qemu-47-instance-0000002d terminated.
Dec  5 07:05:20 np0005546909 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000035.scope: Deactivated successfully.
Dec  5 07:05:20 np0005546909 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000035.scope: Consumed 7.983s CPU time.
Dec  5 07:05:20 np0005546909 systemd-machined[153543]: Machine qemu-57-instance-00000035 terminated.
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : haproxy version is 2.8.14-c23fe91
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [NOTICE]   (224986) : path to executable is /usr/sbin/haproxy
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : Exiting Master process...
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : Exiting Master process...
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [ALERT]    (224986) : Current worker (224988) exited with code 143 (Terminated)
Dec  5 07:05:20 np0005546909 neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd[224982]: [WARNING]  (224986) : All workers exited. Exiting... (0)
Dec  5 07:05:20 np0005546909 systemd[1]: libpod-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope: Deactivated successfully.
Dec  5 07:05:20 np0005546909 podman[225078]: 2025-12-05 12:05:20.178564679 +0000 UTC m=+0.051979545 container died d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.219 187212 INFO nova.virt.libvirt.driver [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Instance destroyed successfully.#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.220 187212 DEBUG nova.objects.instance [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lazy-loading 'resources' on Instance uuid ed7b6780-872e-41ef-a0c7-c48d0d6d13fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc-userdata-shm.mount: Deactivated successfully.
Dec  5 07:05:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay-3400c5b4535476a6d33ef667b47e2b8030edbf230d73171381716b0df7dfcda6-merged.mount: Deactivated successfully.
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.234 187212 DEBUG nova.virt.libvirt.vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-465631494',display_name='tempest-ImagesTestJSON-server-465631494',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-465631494',id=53,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='43e63f5c6b0f4840ad4df23fb5c10764',ramdisk_id='',reservation_id='r-j61263vr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-276789408',owner_user_name='tempest-ImagesTestJSON-276789408-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:17Z,user_data=None,user_id='a00ac4435e6647779ffaf4a5cde18fdb',uuid=ed7b6780-872e-41ef-a0c7-c48d0d6d13fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.235 187212 DEBUG nova.network.os_vif_util [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converting VIF {"id": "db2c3297-b6c8-4933-9328-102d81d6faa3", "address": "fa:16:3e:66:5d:24", "network": {"id": "41b3b495-c1c9-44c0-b1a3-a499df6548dd", "bridge": "br-int", "label": "tempest-ImagesTestJSON-100455550-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "43e63f5c6b0f4840ad4df23fb5c10764", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb2c3297-b6", "ovs_interfaceid": "db2c3297-b6c8-4933-9328-102d81d6faa3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:20 np0005546909 podman[225078]: 2025-12-05 12:05:20.236132204 +0000 UTC m=+0.109547070 container cleanup d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.236 187212 DEBUG nova.network.os_vif_util [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.236 187212 DEBUG os_vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb2c3297-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.244 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 systemd[1]: libpod-conmon-d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc.scope: Deactivated successfully.
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.247 187212 INFO os_vif [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:5d:24,bridge_name='br-int',has_traffic_filtering=True,id=db2c3297-b6c8-4933-9328-102d81d6faa3,network=Network(41b3b495-c1c9-44c0-b1a3-a499df6548dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb2c3297-b6')#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.247 187212 INFO nova.virt.libvirt.driver [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deleting instance files /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd_del#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.248 187212 INFO nova.virt.libvirt.driver [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deletion of /var/lib/nova/instances/ed7b6780-872e-41ef-a0c7-c48d0d6d13fd_del complete#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.279 187212 INFO nova.virt.libvirt.driver [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance destroyed successfully.#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.280 187212 DEBUG nova.objects.instance [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lazy-loading 'resources' on Instance uuid 004672c5-70e2-4940-bc9c-8971d94cc037 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.311 187212 INFO nova.virt.libvirt.driver [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deleting instance files /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037_del#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.312 187212 INFO nova.virt.libvirt.driver [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deletion of /var/lib/nova/instances/004672c5-70e2-4940-bc9c-8971d94cc037_del complete#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.320 187212 INFO nova.compute.manager [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG oslo.service.loopingcall [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.321 187212 DEBUG nova.network.neutron [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:20 np0005546909 podman[225125]: 2025-12-05 12:05:20.335785579 +0000 UTC m=+0.074686128 container remove d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[70fba4e0-1792-4ec6-99fc-76e24c20babe]: (4, ('Fri Dec  5 12:05:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc)\nd7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc\nFri Dec  5 12:05:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd (d7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc)\nd7c0306e2a10db8e470c4fab2a48fd1ce47b66b0541ade98b866f841d6191ddc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.343 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bec9ecb1-8ec8-44a2-93be-7d322917d560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.345 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b3b495-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.347 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 kernel: tap41b3b495-c0: left promiscuous mode
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.365 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[450f11f3-60ec-46cf-b9a7-4d4828e653d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.368 187212 INFO nova.compute.manager [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG oslo.service.loopingcall [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.369 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.381 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3599aa-920e-45f3-822f-3cb5806b1c2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.383 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[45973862-0c5b-4f52-9aa5-284367ff499b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.400 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[654cb02d-0d98-4bff-913a-403dec658e18]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369112, 'reachable_time': 20273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225145, 'error': None, 'target': 'ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 systemd[1]: run-netns-ovnmeta\x2d41b3b495\x2dc1c9\x2d44c0\x2db1a3\x2da499df6548dd.mount: Deactivated successfully.
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.405 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-41b3b495-c1c9-44c0-b1a3-a499df6548dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:05:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:20.405 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[59525c24-db28-4949-9df6-983a67a38410]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.538 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.552 187212 DEBUG nova.network.neutron [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.567 187212 INFO nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Took 0.20 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.605 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936305.6047473, 97020786-7ba5-4c8b-8a2c-838c0f663bb4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.606 187212 INFO nova.compute.manager [-] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.647 187212 DEBUG nova.compute.manager [None req-c06b482d-58d5-4d25-a7cb-b2919a6e66c0 - - - - - -] [instance: 97020786-7ba5-4c8b-8a2c-838c0f663bb4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.657 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.657 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.724 187212 DEBUG nova.compute.provider_tree [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.744 187212 DEBUG nova.scheduler.client.report [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.766 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.795 187212 INFO nova.scheduler.client.report [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Deleted allocations for instance 004672c5-70e2-4940-bc9c-8971d94cc037#033[00m
Dec  5 07:05:20 np0005546909 nova_compute[187208]: 2025-12-05 12:05:20.860 187212 DEBUG oslo_concurrency.lockutils [None req-a588a9c5-438e-4623-b671-0dac99b7d5f5 8456efa356654e5c990efa4aef688e8a 42d9566206cb469ebd803d0600019533 - - default default] Lock "004672c5-70e2-4940-bc9c-8971d94cc037" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.509 187212 DEBUG nova.network.neutron [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.525 187212 INFO nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Took 2.20 seconds to deallocate network for instance.#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.571 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.572 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.642 187212 DEBUG nova.compute.provider_tree [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.652 187212 DEBUG nova.compute.manager [req-de87d5d7-dd4e-427b-b89e-d9e9f9be1400 req-02c953b2-433d-4031-b307-19a506231703 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Received event network-vif-deleted-db2c3297-b6c8-4933-9328-102d81d6faa3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.656 187212 DEBUG nova.scheduler.client.report [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.673 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.699 187212 INFO nova.scheduler.client.report [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Deleted allocations for instance ed7b6780-872e-41ef-a0c7-c48d0d6d13fd#033[00m
Dec  5 07:05:22 np0005546909 nova_compute[187208]: 2025-12-05 12:05:22.757 187212 DEBUG oslo_concurrency.lockutils [None req-13f388ba-fb7a-435d-b9b9-d7159c0cf817 a00ac4435e6647779ffaf4a5cde18fdb 43e63f5c6b0f4840ad4df23fb5c10764 - - default default] Lock "ed7b6780-872e-41ef-a0c7-c48d0d6d13fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:25 np0005546909 podman[225146]: 2025-12-05 12:05:25.224832207 +0000 UTC m=+0.073008330 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7)
Dec  5 07:05:25 np0005546909 podman[225147]: 2025-12-05 12:05:25.243334679 +0000 UTC m=+0.086172579 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:05:25 np0005546909 nova_compute[187208]: 2025-12-05 12:05:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:27 np0005546909 nova_compute[187208]: 2025-12-05 12:05:27.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.182 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:30 np0005546909 podman[225188]: 2025-12-05 12:05:30.253937115 +0000 UTC m=+0.052607033 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:05:30 np0005546909 podman[225189]: 2025-12-05 12:05:30.276800383 +0000 UTC m=+0.073543436 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.840 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.841 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.857 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.981 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.982 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.988 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:30 np0005546909 nova_compute[187208]: 2025-12-05 12:05:30.989 187212 INFO nova.compute.claims [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.167 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936316.1663022, f50947f2-f8d0-4d6b-bca4-b5412a206503 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.167 187212 INFO nova.compute.manager [-] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.180 187212 DEBUG nova.compute.provider_tree [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.306 187212 DEBUG nova.compute.manager [None req-7301aa01-26a2-4daf-bc9f-57f43ebe6a4b - - - - - -] [instance: f50947f2-f8d0-4d6b-bca4-b5412a206503] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.307 187212 DEBUG nova.scheduler.client.report [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.341 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.342 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.487 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.488 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.533 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.556 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.676 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.677 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.677 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating image(s)#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.678 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.679 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.694 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.764 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.765 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.765 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.775 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.828 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.829 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.851 187212 DEBUG nova.policy [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.863 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.864 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.865 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.924 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.925 187212 DEBUG nova.virt.disk.api [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.925 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.979 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.980 187212 DEBUG nova.virt.disk.api [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.980 187212 DEBUG nova.objects.instance [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.997 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.997 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ensure instance console log exists: /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:31 np0005546909 nova_compute[187208]: 2025-12-05 12:05:31.998 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:32 np0005546909 nova_compute[187208]: 2025-12-05 12:05:32.603 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.049 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.049 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.065 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.121 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.122 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.127 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.128 187212 INFO nova.compute.claims [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.231 187212 DEBUG nova.compute.provider_tree [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.242 187212 DEBUG nova.scheduler.client.report [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.258 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.259 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.298 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.298 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.315 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.332 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.413 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.414 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.414 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating image(s)#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.415 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.415 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.416 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.427 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.485 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.486 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.487 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.498 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.560 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.561 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.595 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.596 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.597 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.663 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.665 187212 DEBUG nova.virt.disk.api [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Checking if we can resize image /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.665 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.720 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.721 187212 DEBUG nova.virt.disk.api [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Cannot resize image /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.721 187212 DEBUG nova.objects.instance [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'migration_context' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.736 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Ensure instance console log exists: /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.737 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.738 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:33 np0005546909 nova_compute[187208]: 2025-12-05 12:05:33.973 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Successfully created port: 2e9efd6c-740c-405b-b9f0-bd46434070a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:05:34 np0005546909 nova_compute[187208]: 2025-12-05 12:05:34.043 187212 DEBUG nova.policy [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.117 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.118 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.138 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:05:35 np0005546909 podman[225266]: 2025-12-05 12:05:35.203198928 +0000 UTC m=+0.055537277 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.212 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Successfully created port: 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.215 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936320.2146416, ed7b6780-872e-41ef-a0c7-c48d0d6d13fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.215 187212 INFO nova.compute.manager [-] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.223 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.223 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.230 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.230 187212 INFO nova.compute.claims [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.235 187212 DEBUG nova.compute.manager [None req-99fe8518-2502-4122-8c88-1c5984204fe7 - - - - - -] [instance: ed7b6780-872e-41ef-a0c7-c48d0d6d13fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.279 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936320.2777467, 004672c5-70e2-4940-bc9c-8971d94cc037 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.279 187212 INFO nova.compute.manager [-] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.304 187212 DEBUG nova.compute.manager [None req-abe09de8-7e1e-4102-a1f1-f812a840ab91 - - - - - -] [instance: 004672c5-70e2-4940-bc9c-8971d94cc037] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.377 187212 DEBUG nova.compute.provider_tree [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.379 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Successfully updated port: 2e9efd6c-740c-405b-b9f0-bd46434070a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.397 187212 DEBUG nova.scheduler.client.report [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.400 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.422 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.423 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.596 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.597 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.617 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.631 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.640 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.732 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.734 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.734 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating image(s)#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.735 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.735 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.736 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.753 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.820 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.821 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.821 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.833 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.890 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.891 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.924 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.926 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.927 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.986 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.988 187212 DEBUG nova.virt.disk.api [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:35 np0005546909 nova_compute[187208]: 2025-12-05 12:05:35.989 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.048 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.049 187212 DEBUG nova.virt.disk.api [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.050 187212 DEBUG nova.objects.instance [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.062 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.063 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Ensure instance console log exists: /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.063 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.064 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.064 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.117 187212 DEBUG nova.policy [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.331 187212 DEBUG nova.compute.manager [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.332 187212 DEBUG nova.compute.manager [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:36 np0005546909 nova_compute[187208]: 2025-12-05 12:05:36.333 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.084 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.419 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Successfully updated port: 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.432 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.507 187212 DEBUG nova.network.neutron [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.536 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.537 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance network_info: |[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.537 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.538 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.540 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start _get_guest_xml network_info=[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.547 187212 WARNING nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.551 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.552 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.host [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.555 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.556 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.557 187212 DEBUG nova.virt.hardware [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.561 187212 DEBUG nova.virt.libvirt.vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.561 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.562 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.563 187212 DEBUG nova.objects.instance [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.576 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <uuid>24358eea-14fb-4863-a6c4-aadcdb495f54</uuid>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <name>instance-00000036</name>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestOtherB-server-1629320086</nova:name>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:37</nova:creationTime>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        <nova:port uuid="2e9efd6c-740c-405b-b9f0-bd46434070a7">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="serial">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="uuid">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ab:5e:ef"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <target dev="tap2e9efd6c-74"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log" append="off"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:37 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:37 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:37 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:37 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.577 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Preparing to wait for external event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.virt.libvirt.vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.579 187212 DEBUG nova.network.os_vif_util [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.580 187212 DEBUG os_vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.580 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.581 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.581 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e9efd6c-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.584 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e9efd6c-74, col_values=(('external_ids', {'iface-id': '2e9efd6c-740c-405b-b9f0-bd46434070a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:5e:ef', 'vm-uuid': '24358eea-14fb-4863-a6c4-aadcdb495f54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:37 np0005546909 NetworkManager[55691]: <info>  [1764936337.5871] manager: (tap2e9efd6c-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.588 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.592 187212 INFO os_vif [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.603 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.651 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:ab:5e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.652 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Using config drive#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.686 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Successfully created port: 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:05:37 np0005546909 nova_compute[187208]: 2025-12-05 12:05:37.826 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.069 187212 INFO nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating config drive at /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.075 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60k8d0z5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.153 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.153 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.181 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.206 187212 DEBUG oslo_concurrency.processutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp60k8d0z5" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:38 np0005546909 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.2886] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Dec  5 07:05:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:38Z|00439|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:38Z|00440|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.290 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.300 187212 INFO nova.compute.claims [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.303 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.305 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.307 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:05:38 np0005546909 systemd-udevd[225319]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[646ed345-13e7-4612-8c40-140cc801e53b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.321 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5c17e5c-21 in ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.323 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5c17e5c-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7115a235-62ab-4616-815e-8151d0ded9f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc01977-88b9-482a-905c-0d3d55b1a0cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.3365] device (tap2e9efd6c-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.3377] device (tap2e9efd6c-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.337 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8d01b5bc-88d4-4242-9429-5095d79bcd41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 systemd-machined[153543]: New machine qemu-58-instance-00000036.
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.349 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.352 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3fafc3ef-2ec3-44e0-9333-b418125cf30c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:38Z|00441|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec  5 07:05:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:38Z|00442|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 systemd[1]: Started Virtual Machine qemu-58-instance-00000036.
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.384 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d77cf07f-cff2-41e3-bf75-50638c2b7d8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.3946] manager: (tapb5c17e5c-20): new Veth device (/org/freedesktop/NetworkManager/Devices/181)
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.394 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0331eb-2012-4c8f-b668-2ff0e378b4f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.427 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[729aad2f-cb1a-4cf5-b189-17a17c840b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.429 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f0bd35-7b2c-489f-b02e-9ae79034ca2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:05:38 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.4505] device (tapb5c17e5c-20): carrier: link connected
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.456 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[114f970c-ff98-45a3-b51f-d6bb44fdc42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.460 187212 DEBUG nova.compute.provider_tree [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.471 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6990bb94-e38c-47d3-8b1c-ee8af83b5d4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225357, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.475 187212 DEBUG nova.scheduler.client.report [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.485 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ecde5040-746c-48c5-8d96-3b39ac491f74]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:429f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371901, 'tstamp': 371901}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225358, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.499 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[474a3ebf-e833-4b25-aac4-8e43d1b92af9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225359, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.501 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.502 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.528 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[390ee6c2-a1d9-4da0-91c1-46fa7809f028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.548 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.549 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.569 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.590 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.590 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42b2f51f-3aa7-4284-af67-45a2c749ea1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.592 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.592 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.593 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 kernel: tapb5c17e5c-20: entered promiscuous mode
Dec  5 07:05:38 np0005546909 NetworkManager[55691]: <info>  [1764936338.5958] manager: (tapb5c17e5c-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.600 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:38 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:38Z|00443|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.602 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.603 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[639d3043-f916-47cc-bf0e-2b0c3f71dbaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.604 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/b5c17e5c-2b6c-48d3-9992-ac34070e3363.pid.haproxy
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID b5c17e5c-2b6c-48d3-9992-ac34070e3363
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:05:38 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:38.605 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'env', 'PROCESS_TAG=haproxy-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5c17e5c-2b6c-48d3-9992-ac34070e3363.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.712 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.714 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating image(s)#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.715 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.716 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.729 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.788 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.789 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.790 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.802 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.859 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.861 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.894 187212 DEBUG nova.policy [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.901 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.902 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.902 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.968 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.969 187212 DEBUG nova.virt.disk.api [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:38 np0005546909 nova_compute[187208]: 2025-12-05 12:05:38.969 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:38 np0005546909 podman[225399]: 2025-12-05 12:05:38.972457568 +0000 UTC m=+0.044974044 container create 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:05:39 np0005546909 systemd[1]: Started libpod-conmon-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope.
Dec  5 07:05:39 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.036 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.037 187212 DEBUG nova.virt.disk.api [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.038 187212 DEBUG nova.objects.instance [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:39 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93f32fdd3ea0552c523abfc1a627c1ddf05c35a6f969e26671c37410720e74dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:05:39 np0005546909 podman[225399]: 2025-12-05 12:05:38.947001746 +0000 UTC m=+0.019518262 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.053 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.053 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Ensure instance console log exists: /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.054 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.055 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.055 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:39 np0005546909 podman[225399]: 2025-12-05 12:05:39.068387585 +0000 UTC m=+0.140904101 container init 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:05:39 np0005546909 podman[225399]: 2025-12-05 12:05:39.074441859 +0000 UTC m=+0.146958365 container start 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.086 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:39 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : New worker (225425) forked
Dec  5 07:05:39 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : Loading success.
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.626 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936339.6255472, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:39 np0005546909 nova_compute[187208]: 2025-12-05 12:05:39.627 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Started (Lifecycle Event)#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.075 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.079 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936339.6265035, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.097 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.101 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.122 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:40 np0005546909 nova_compute[187208]: 2025-12-05 12:05:40.242 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Successfully created port: c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.114 187212 DEBUG nova.network.neutron [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.137 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.138 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance network_info: |[{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.141 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start _get_guest_xml network_info=[{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.147 187212 WARNING nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.153 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.154 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.157 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.libvirt.host [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.158 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.159 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.160 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.161 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.161 187212 DEBUG nova.virt.hardware [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.165 187212 DEBUG nova.virt.libvirt.vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_n
ame='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:33Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.165 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.166 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.166 187212 DEBUG nova.objects.instance [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'pci_devices' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.181 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <uuid>472c7e2c-bdad-4230-904b-6937ceb872d2</uuid>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <name>instance-00000037</name>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-292918791</nova:name>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:41</nova:creationTime>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:user uuid="8cf2534e7c394130b675e44ed567401b">tempest-FloatingIPsAssociationTestJSON-883508882-project-member</nova:user>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:project uuid="85037de7275442698e604ee3f6283cbc">tempest-FloatingIPsAssociationTestJSON-883508882</nova:project>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        <nova:port uuid="9357c6a6-eb6f-4ab9-bfd6-486765004ac5">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="serial">472c7e2c-bdad-4230-904b-6937ceb872d2</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="uuid">472c7e2c-bdad-4230-904b-6937ceb872d2</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:08:e8:08"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <target dev="tap9357c6a6-eb"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/console.log" append="off"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:41 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:41 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:41 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:41 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Preparing to wait for external event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.183 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.184 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.185 187212 DEBUG nova.virt.libvirt.vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:33Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.185 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.186 187212 DEBUG nova.network.os_vif_util [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.186 187212 DEBUG os_vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.188 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.188 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.191 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.192 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9357c6a6-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.192 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9357c6a6-eb, col_values=(('external_ids', {'iface-id': '9357c6a6-eb6f-4ab9-bfd6-486765004ac5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:e8:08', 'vm-uuid': '472c7e2c-bdad-4230-904b-6937ceb872d2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:41 np0005546909 NetworkManager[55691]: <info>  [1764936341.1951] manager: (tap9357c6a6-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.195 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.196 187212 DEBUG nova.network.neutron [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.199 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.204 187212 INFO os_vif [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb')#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.219 187212 DEBUG oslo_concurrency.lockutils [req-dee4d04d-6eea-49af-8402-e07c5f5b35df req-646dcf53-7ce4-4456-959f-3cf991760eb2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.233 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.233 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.235 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:05:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:41.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.266 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No VIF found with MAC fa:16:3e:08:e8:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.267 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Using config drive#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.298 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Successfully updated port: 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG nova.compute.manager [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG nova.compute.manager [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.302 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.303 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.303 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:41 np0005546909 nova_compute[187208]: 2025-12-05 12:05:41.321 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.040 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.054 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.059 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.133 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.134 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.135 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.135 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.204 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:42 np0005546909 podman[225444]: 2025-12-05 12:05:42.208260742 +0000 UTC m=+0.060635114 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.266 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.267 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.323 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.329 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.399 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.400 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.424 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Successfully updated port: c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.459 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.464 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000037, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config'#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.468 187212 INFO nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Creating config drive at /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.473 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqe8e12u7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.600 187212 DEBUG oslo_concurrency.processutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqe8e12u7" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.606 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.622 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.622 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.623 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:42 np0005546909 kernel: tap9357c6a6-eb: entered promiscuous mode
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.6671] manager: (tap9357c6a6-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/184)
Dec  5 07:05:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:42Z|00444|binding|INFO|Claiming lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for this chassis.
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:42Z|00445|binding|INFO|9357c6a6-eb6f-4ab9-bfd6-486765004ac5: Claiming fa:16:3e:08:e8:08 10.100.0.14
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.682 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:e8:08 10.100.0.14'], port_security=['fa:16:3e:08:e8:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9357c6a6-eb6f-4ab9-bfd6-486765004ac5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.683 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d bound to our chassis#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.684 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.694 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[44607439-04ab-40da-8182-0a822c12dd74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.695 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f4c4888-41 in ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:05:42 np0005546909 systemd-udevd[225498]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.697 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f4c4888-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.697 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eda87d39-d940-4b9b-bd28-74db3366a334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.700 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b8cbbf-fce2-4832-94a2-f9c011fdc411]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.7117] device (tap9357c6a6-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.7130] device (tap9357c6a6-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.712 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[614786d7-edeb-4ac4-a189-f184d35d8fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 systemd-machined[153543]: New machine qemu-59-instance-00000037.
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.722 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:42Z|00446|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 ovn-installed in OVS
Dec  5 07:05:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:42Z|00447|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 up in Southbound
Dec  5 07:05:42 np0005546909 systemd[1]: Started Virtual Machine qemu-59-instance-00000037.
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.728 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.728 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97563db9-4daa-4faa-8375-05cafdcc2adb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.742 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.743 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5600MB free_disk=73.25677871704102GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.744 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.744 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.757 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[08f8ceee-a629-45f1-b07b-e131cda7484d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 systemd-udevd[225503]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.7639] manager: (tap0f4c4888-40): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[17e8c18e-7079-457c-8c90-f131f5cc13df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.791 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[babb764e-a620-4ee9-8273-df70ef7ecb84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.794 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf58116-1f55-4b49-950e-1350b7e8ca25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.8146] device (tap0f4c4888-40): carrier: link connected
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.819 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[22d93e31-f2bf-477a-a049-2a1f52ec51af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.837 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fe484b-b450-4823-b67e-e54346ae2797]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225532, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.852 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[85b622f7-8cb6-4d85-8896-4bf1f26c576a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:4563'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372337, 'tstamp': 372337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225533, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.867 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a09b858-131b-471f-b228-e326d7e007eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225534, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.895 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c11d78-cf57-48a4-b9bb-4273a582c580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1503caec-a647-4de4-8da2-2a19f8ff0720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.954 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.955 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.955 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:42 np0005546909 NetworkManager[55691]: <info>  [1764936342.9582] manager: (tap0f4c4888-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 kernel: tap0f4c4888-40: entered promiscuous mode
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.961 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:42Z|00448|binding|INFO|Releasing lport b2e28c8a-557d-459b-807e-dd1f5be0a608 from this chassis (sb_readonly=0)
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.965 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.975 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39249698-c960-4310-8ea9-160772819d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.976 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-0f4c4888-4b32-4259-8441-31af091e0c7d
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/0f4c4888-4b32-4259-8441-31af091e0c7d.pid.haproxy
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 0f4c4888-4b32-4259-8441-31af091e0c7d
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:05:42 np0005546909 nova_compute[187208]: 2025-12-05 12:05:42.977 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:42.979 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'env', 'PROCESS_TAG=haproxy-0f4c4888-4b32-4259-8441-31af091e0c7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f4c4888-4b32-4259-8441-31af091e0c7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.102 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG nova.compute.manager [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-changed-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG nova.compute.manager [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Refreshing instance network info cache due to event network-changed-c5cb68aa-e5c2-48b0-b9c4-e0542120e065. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.192 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.261 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 472c7e2c-bdad-4230-904b-6937ceb872d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.262 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8888dd78-1c78-4065-8536-9a1096bdf57b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.272 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.2724597, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.273 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Started (Lifecycle Event)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.289 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.289 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.290 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.297 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.302 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.2733757, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.302 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Paused (Lifecycle Event)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.324 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:43 np0005546909 podman[225573]: 2025-12-05 12:05:43.329834134 +0000 UTC m=+0.048125854 container create 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.339 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.363 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.364 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:05:43 np0005546909 systemd[1]: Started libpod-conmon-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope.
Dec  5 07:05:43 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:05:43 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/346d0c910feecbfb16f9369239d1a7161a45d173e7171bca2bd39f251f209cca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:05:43 np0005546909 podman[225573]: 2025-12-05 12:05:43.307111761 +0000 UTC m=+0.025403521 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:05:43 np0005546909 podman[225573]: 2025-12-05 12:05:43.414979542 +0000 UTC m=+0.133271302 container init 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:05:43 np0005546909 podman[225573]: 2025-12-05 12:05:43.419882313 +0000 UTC m=+0.138174033 container start 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.431 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:05:43 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : New worker (225594) forked
Dec  5 07:05:43 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : Loading success.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.448 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.473 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.480 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.480 187212 INFO nova.compute.claims [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.608 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.609 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.609 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Processing event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.610 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.611 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 WARNING nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state building and task_state spawning.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-changed-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.612 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Refreshing instance network info cache due to event network-changed-549318e9-e629-4e2c-8cbb-3cd263c2bc34. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.613 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.616 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.621 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936343.6204307, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.621 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Resumed (Lifecycle Event)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.624 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.628 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance spawned successfully.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.629 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.645 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.659 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.660 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.661 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.661 187212 DEBUG nova.virt.libvirt.driver [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.670 187212 DEBUG nova.compute.provider_tree [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.693 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.695 187212 DEBUG nova.scheduler.client.report [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.734 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.735 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.742 187212 INFO nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 12.07 seconds to spawn the instance on the hypervisor.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.743 187212 DEBUG nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.810 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.810 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.820 187212 INFO nova.compute.manager [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 12.87 seconds to build instance.
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.835 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.840 187212 DEBUG oslo_concurrency.lockutils [None req-b8ce08d9-9bc0-46eb-8925-12535412eb71 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.852 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.950 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.952 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.952 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating image(s)
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.953 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.954 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:43 np0005546909 nova_compute[187208]: 2025-12-05 12:05:43.971 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.036 187212 DEBUG nova.network.neutron [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.041 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.042 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.043 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.055 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.076 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.077 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance network_info: |[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.078 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.079 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Refreshing network info cache for port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.082 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start _get_guest_xml network_info=[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.086 187212 WARNING nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.091 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.092 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.100 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.101 187212 DEBUG nova.virt.libvirt.host [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.102 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.102 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.103 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.104 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.105 187212 DEBUG nova.virt.hardware [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.109 187212 DEBUG nova.virt.libvirt.vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempes
t-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:35Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.109 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.110 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.111 187212 DEBUG nova.objects.instance [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.115 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.116 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.136 187212 DEBUG nova.policy [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.141 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <uuid>cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</uuid>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <name>instance-00000038</name>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1365452817</nova:name>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:44</nova:creationTime>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        <nova:port uuid="549318e9-e629-4e2c-8cbb-3cd263c2bc34">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="serial">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="uuid">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:9b:d7:ed"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <target dev="tap549318e9-e6"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log" append="off"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:44 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:44 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:44 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:44 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.142 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Preparing to wait for external event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.143 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.144 187212 DEBUG nova.virt.libvirt.vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:35Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.144 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.145 187212 DEBUG nova.network.os_vif_util [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.145 187212 DEBUG os_vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.146 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap549318e9-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.149 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap549318e9-e6, col_values=(('external_ids', {'iface-id': '549318e9-e629-4e2c-8cbb-3cd263c2bc34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:d7:ed', 'vm-uuid': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.151 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.152 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:44 np0005546909 NetworkManager[55691]: <info>  [1764936344.1524] manager: (tap549318e9-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.152 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.175 187212 INFO os_vif [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.216 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.217 187212 DEBUG nova.virt.disk.api [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.217 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.263 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.264 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.264 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:9b:d7:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.265 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Using config drive#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.291 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.292 187212 DEBUG nova.virt.disk.api [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.293 187212 DEBUG nova.objects.instance [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Ensure instance console log exists: /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.306 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.307 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.307 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.475 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.732 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.733 187212 DEBUG nova.network.neutron [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.749 187212 DEBUG oslo_concurrency.lockutils [req-e0f6b1ee-d360-44e3-adbf-2abd84f975af req-523c4e75-ed51-4cde-b289-434198759f0f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.963 187212 INFO nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Creating config drive at /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config#033[00m
Dec  5 07:05:44 np0005546909 nova_compute[187208]: 2025-12-05 12:05:44.967 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmbmcyo8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.094 187212 DEBUG oslo_concurrency.processutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgmbmcyo8" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.1535] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Dec  5 07:05:45 np0005546909 kernel: tap549318e9-e6: entered promiscuous mode
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.1690] device (tap549318e9-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.1701] device (tap549318e9-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:45Z|00449|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec  5 07:05:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:45Z|00450|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.173 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.174 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.195 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[550a2b16-f310-4e90-9e26-abd43a4f4f54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.196 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a2d11fe-a1 in ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.198 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a2d11fe-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.198 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa92e7be-9178-408b-92db-7f19fa404d78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.200 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92a7bb49-8d73-47a7-a0db-be6e4ab1e32b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:45Z|00451|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec  5 07:05:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:45Z|00452|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.214 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1214fad4-df87-4eb1-a2e7-21c703e59a70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 systemd-machined[153543]: New machine qemu-60-instance-00000038.
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.228 187212 DEBUG nova.network.neutron [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:45 np0005546909 systemd[1]: Started Virtual Machine qemu-60-instance-00000038.
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ed83f89-63b9-444e-b3a3-c48c210e59d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.252 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.252 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance network_info: |[{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.253 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.253 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Refreshing network info cache for port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.258 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start _get_guest_xml network_info=[{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': '6e277715-617f-4e35-89c7-208beae9fd5c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.267 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9fecbcc-0c07-49dd-9af0-1eaeae99f6ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.273 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3d1bef-860f-4a91-9046-11cdb6dae9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.2774] manager: (tap4a2d11fe-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.280 187212 WARNING nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.291 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.292 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.296 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.297 187212 DEBUG nova.virt.libvirt.host [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.297 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.298 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.299 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.300 187212 DEBUG nova.virt.hardware [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.304 187212 DEBUG nova.virt.libvirt.vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempes
t-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:38Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.305 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.306 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.306 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca72d05-486a-4969-8f51-78aada80563d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.307 187212 DEBUG nova.objects.instance [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.309 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef962ea-f7b9-459a-9d36-ba8732dcda33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.322 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <uuid>8888dd78-1c78-4065-8536-9a1096bdf57b</uuid>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <name>instance-00000039</name>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-2001854085</nova:name>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:45</nova:creationTime>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        <nova:port uuid="c5cb68aa-e5c2-48b0-b9c4-e0542120e065">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="serial">8888dd78-1c78-4065-8536-9a1096bdf57b</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="uuid">8888dd78-1c78-4065-8536-9a1096bdf57b</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:8a:a8:16"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <target dev="tapc5cb68aa-e5"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/console.log" append="off"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:45 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:45 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:45 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:45 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.324 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Preparing to wait for external event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.324 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.325 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.325 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.326 187212 DEBUG nova.virt.libvirt.vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_na
me='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:38Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.326 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.327 187212 DEBUG nova.network.os_vif_util [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.327 187212 DEBUG os_vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.328 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.331 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5cb68aa-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.332 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5cb68aa-e5, col_values=(('external_ids', {'iface-id': 'c5cb68aa-e5c2-48b0-b9c4-e0542120e065', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:a8:16', 'vm-uuid': '8888dd78-1c78-4065-8536-9a1096bdf57b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.3346] manager: (tapc5cb68aa-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.3410] device (tap4a2d11fe-a0): carrier: link connected
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.344 187212 INFO os_vif [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5')#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.347 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7e51955f-08ae-4ba1-8fde-97da5301b660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.379 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5267b2a-ca4d-4c8c-8a6d-a6d0603526d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225658, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.396 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c7cb2f57-127a-4912-b6ef-780c0398530b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:9456'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372590, 'tstamp': 372590}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225659, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43ce2aed-a120-403f-89af-2375816f71ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225660, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1486da8e-4c44-48df-8d0e-2bdd1bf96624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.495 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e54bdfa-965f-4cb4-92c9-e158a6105f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.496 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.496 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.497 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 NetworkManager[55691]: <info>  [1764936345.4996] manager: (tap4a2d11fe-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 kernel: tap4a2d11fe-a0: entered promiscuous mode
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.503 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.504 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.505 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:45Z|00453|binding|INFO|Releasing lport 27f6a3c0-dd69-4255-8d00-850605f3016e from this chassis (sb_readonly=0)
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.521 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.522 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.523 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fdf038-604b-47c2-ba2f-cc52814316c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.523 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.pid.haproxy
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec  5 07:05:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:45.525 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'env', 'PROCESS_TAG=haproxy-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a2d11fe-a91d-4cf5-bde7-283f0aa52f63.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.608 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.609 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.609 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:8a:a8:16, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.610 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Using config drive
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.647 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936345.6473682, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:45 np0005546909 nova_compute[187208]: 2025-12-05 12:05:45.648 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Started (Lifecycle Event)
Dec  5 07:05:45 np0005546909 podman[225704]: 2025-12-05 12:05:45.878788603 +0000 UTC m=+0.047584219 container create 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:05:45 np0005546909 systemd[1]: Started libpod-conmon-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope.
Dec  5 07:05:45 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:05:45 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0675ec35d20a48a0477b2dc90980940bf1de49a39a39196259a264090cabf69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:05:45 np0005546909 podman[225704]: 2025-12-05 12:05:45.853241718 +0000 UTC m=+0.022037354 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:05:45 np0005546909 podman[225704]: 2025-12-05 12:05:45.950846364 +0000 UTC m=+0.119641990 container init 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 07:05:45 np0005546909 podman[225704]: 2025-12-05 12:05:45.955655383 +0000 UTC m=+0.124450999 container start 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:05:45 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : New worker (225725) forked
Dec  5 07:05:45 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : Loading success.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.099 187212 DEBUG nova.compute.manager [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG oslo_concurrency.lockutils [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.100 187212 DEBUG nova.compute.manager [req-f16d2488-c61f-4e3d-8bcb-b336ca312a80 req-4c27c47b-29e9-4d22-bd67-d0455293cc8c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Processing event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.101 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.106 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.109 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.112 187212 INFO nova.virt.libvirt.driver [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance spawned successfully.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.113 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936345.647611, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.113 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Paused (Lifecycle Event)
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.114 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.144 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.148 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936346.1057503, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.198 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Resumed (Lifecycle Event)
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.206 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.208 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.209 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.210 187212 DEBUG nova.virt.libvirt.driver [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.390 187212 DEBUG nova.compute.manager [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.391 187212 DEBUG oslo_concurrency.lockutils [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.392 187212 DEBUG nova.compute.manager [req-6a17f39e-824e-419b-aaf1-c1651a594f0b req-b9671eee-62e5-4f01-9d52-9f4b8aa7d079 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Processing event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.393 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.412 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.419 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.423 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance spawned successfully.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.424 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.427 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.519 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.520 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.521 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.521 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.522 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.523 187212 DEBUG nova.virt.libvirt.driver [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.529 187212 INFO nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 13.12 seconds to spawn the instance on the hypervisor.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.529 187212 DEBUG nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936346.3967624, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.533 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Resumed (Lifecycle Event)
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.547 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Successfully created port: 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.908 187212 INFO nova.compute.manager [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 13.80 seconds to build instance.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.960 187212 INFO nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 11.23 seconds to spawn the instance on the hypervisor.
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.960 187212 DEBUG nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:05:46 np0005546909 nova_compute[187208]: 2025-12-05 12:05:46.961 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.134 187212 INFO nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Creating config drive at /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.140 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo7qit4xa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.258 187212 DEBUG oslo_concurrency.lockutils [None req-62aa4af6-f395-40a9-8cf5-25b9941dad05 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.268 187212 DEBUG oslo_concurrency.processutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo7qit4xa" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.326 187212 INFO nova.compute.manager [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 12.12 seconds to build instance.
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.340 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updated VIF entry in instance network info cache for port 549318e9-e629-4e2c-8cbb-3cd263c2bc34. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.340 187212 DEBUG nova.network.neutron [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:05:47 np0005546909 NetworkManager[55691]: <info>  [1764936347.3599] manager: (tapc5cb68aa-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Dec  5 07:05:47 np0005546909 kernel: tapc5cb68aa-e5: entered promiscuous mode
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00454|binding|INFO|Claiming lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for this chassis.
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00455|binding|INFO|c5cb68aa-e5c2-48b0-b9c4-e0542120e065: Claiming fa:16:3e:8a:a8:16 10.100.0.13
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.366 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00456|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 ovn-installed in OVS
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:47 np0005546909 systemd-udevd[225753]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:47 np0005546909 systemd-machined[153543]: New machine qemu-61-instance-00000039.
Dec  5 07:05:47 np0005546909 NetworkManager[55691]: <info>  [1764936347.4188] device (tapc5cb68aa-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:47 np0005546909 NetworkManager[55691]: <info>  [1764936347.4196] device (tapc5cb68aa-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:47 np0005546909 systemd[1]: Started Virtual Machine qemu-61-instance-00000039.
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.434 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:a8:16 10.100.0.13'], port_security=['fa:16:3e:8a:a8:16 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c5cb68aa-e5c2-48b0-b9c4-e0542120e065) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.435 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis#033[00m
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00457|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 up in Southbound
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.437 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:05:47 np0005546909 NetworkManager[55691]: <info>  [1764936347.4479] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.447 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:47 np0005546909 NetworkManager[55691]: <info>  [1764936347.4487] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.452 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.453 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b26359f0-bb73-4963-b318-b296cc54f55e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.453 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.454 187212 DEBUG oslo_concurrency.lockutils [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.454 187212 DEBUG nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.455 187212 WARNING nova.compute.manager [req-7dcb0343-83f1-45a2-a419-2fdb54b6cf87 req-bf00b6ae-b0ee-4c2c-a127-a38125b26d4e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.456 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.484 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[67a22f9b-5a4e-4a90-ac61-69de9cd64684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.487 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a071a209-75fa-4fd9-85de-86fc5a693deb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.512 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4128689e-8845-4dcc-911a-f7001b6e3779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.532 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd9cf6d-f351-4cea-9d3b-8701d1965570]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225768, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.547 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e371ca39-2be8-478f-a1ee-1ae6880baa2f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225769, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225769, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.549 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:47.554 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.606 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.626 187212 DEBUG oslo_concurrency.lockutils [None req-61e87476-59e6-4365-993b-5f6a528af143 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00458|binding|INFO|Releasing lport b2e28c8a-557d-459b-807e-dd1f5be0a608 from this chassis (sb_readonly=0)
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00459|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:05:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:47Z|00460|binding|INFO|Releasing lport 27f6a3c0-dd69-4255-8d00-850605f3016e from this chassis (sb_readonly=0)
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.749 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936347.7358027, 8888dd78-1c78-4065-8536-9a1096bdf57b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.749 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Started (Lifecycle Event)#033[00m
Dec  5 07:05:47 np0005546909 nova_compute[187208]: 2025-12-05 12:05:47.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.050 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.055 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936347.7361302, 8888dd78-1c78-4065-8536-9a1096bdf57b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.056 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.093 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.097 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.171 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.505 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updated VIF entry in instance network info cache for port c5cb68aa-e5c2-48b0-b9c4-e0542120e065. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.506 187212 DEBUG nova.network.neutron [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [{"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.532 187212 DEBUG oslo_concurrency.lockutils [req-eff3c99a-e519-4f33-bccb-4cbcb7fb3863 req-deb689f2-f3a8-4a84-b976-3b33ecedc5fc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-8888dd78-1c78-4065-8536-9a1096bdf57b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.727 187212 DEBUG nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.728 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG oslo_concurrency.lockutils [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.729 187212 DEBUG nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:48 np0005546909 nova_compute[187208]: 2025-12-05 12:05:48.730 187212 WARNING nova.compute.manager [req-2fcfe6a9-cd67-4782-b34f-e4af8b80b09b req-1c23b12b-ca37-4800-82f1-223e4df2d6ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:05:49 np0005546909 podman[225777]: 2025-12-05 12:05:49.258004159 +0000 UTC m=+0.106285906 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm)
Dec  5 07:05:49 np0005546909 nova_compute[187208]: 2025-12-05 12:05:49.421 187212 DEBUG nova.compute.manager [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:49 np0005546909 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG nova.compute.manager [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:49 np0005546909 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:49 np0005546909 nova_compute[187208]: 2025-12-05 12:05:49.423 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:49 np0005546909 nova_compute[187208]: 2025-12-05 12:05:49.424 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.264 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Successfully updated port: 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.333 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.334 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.334 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:50 np0005546909 nova_compute[187208]: 2025-12-05 12:05:50.657 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.270 187212 DEBUG nova.compute.manager [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-changed-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.271 187212 DEBUG nova.compute.manager [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Refreshing instance network info cache due to event network-changed-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.272 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.527 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.528 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.529 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.530 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.531 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Processing event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.532 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.533 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.533 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.534 187212 DEBUG oslo_concurrency.lockutils [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.535 187212 DEBUG nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.536 187212 WARNING nova.compute.manager [req-28006345-53a7-4a52-bcdc-3872db13a72a req-6e6758b9-c0a8-45fa-8574-74aaecc2f96c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.537 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.555 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936351.5424495, 8888dd78-1c78-4065-8536-9a1096bdf57b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.558 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.563 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.578 187212 INFO nova.virt.libvirt.driver [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance spawned successfully.#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.579 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.583 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.587 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.601 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.602 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.603 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.603 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.604 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.605 187212 DEBUG nova.virt.libvirt.driver [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.611 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.702 187212 INFO nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 12.99 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.702 187212 DEBUG nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.703 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.780 187212 INFO nova.compute.manager [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 13.52 seconds to build instance.#033[00m
Dec  5 07:05:51 np0005546909 nova_compute[187208]: 2025-12-05 12:05:51.802 187212 DEBUG oslo_concurrency.lockutils [None req-2846e67b-eebc-4941-9af3-ec1096ccaba5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.177 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.178 187212 DEBUG nova.network.neutron [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.199 187212 DEBUG oslo_concurrency.lockutils [req-60fc4812-bc90-4ec8-b980-492824b9c53f req-d06c9b18-45cb-428c-8932-8c1cfbb5436f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.346 187212 DEBUG nova.network.neutron [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.369 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.370 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance network_info: |[{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.372 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.372 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Refreshing network info cache for port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.376 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start _get_guest_xml network_info=[{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.381 187212 WARNING nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.389 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.390 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.393 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.394 187212 DEBUG nova.virt.libvirt.host [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.395 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.395 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='09233d41-3279-4f39-ac6e-a21662b4f176',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.396 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.397 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.397 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.398 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.398 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.399 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.399 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.400 187212 DEBUG nova.virt.hardware [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.405 187212 DEBUG nova.virt.libvirt.vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempes
t-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:43Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.406 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.407 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.408 187212 DEBUG nova.objects.instance [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.422 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <uuid>b81bb939-d14f-4a72-b7fe-95fc5d8810a1</uuid>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <name>instance-0000003a</name>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <memory>196608</memory>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1462907521</nova:name>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:05:52</nova:creationTime>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.micro">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:memory>192</nova:memory>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        <nova:port uuid="5683f8a8-691c-43f3-a88f-eb0c30ccb3c5">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="serial">b81bb939-d14f-4a72-b7fe-95fc5d8810a1</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="uuid">b81bb939-d14f-4a72-b7fe-95fc5d8810a1</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:d3:3c:38"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <target dev="tap5683f8a8-69"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/console.log" append="off"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:05:52 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:05:52 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:05:52 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:05:52 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.431 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Preparing to wait for external event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.431 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.432 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.432 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.433 187212 DEBUG nova.virt.libvirt.vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_na
me='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:43Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.433 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.434 187212 DEBUG nova.network.os_vif_util [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.436 187212 DEBUG os_vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.437 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.438 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5683f8a8-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5683f8a8-69, col_values=(('external_ids', {'iface-id': '5683f8a8-691c-43f3-a88f-eb0c30ccb3c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:3c:38', 'vm-uuid': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:52 np0005546909 NetworkManager[55691]: <info>  [1764936352.4868] manager: (tap5683f8a8-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.489 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.494 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.496 187212 INFO os_vif [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69')#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.573 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.574 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] No VIF found with MAC fa:16:3e:d3:3c:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.574 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Using config drive#033[00m
Dec  5 07:05:52 np0005546909 nova_compute[187208]: 2025-12-05 12:05:52.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.564 187212 INFO nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Creating config drive at /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config#033[00m
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.569 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpojzq8yb5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.697 187212 DEBUG oslo_concurrency.processutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpojzq8yb5" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:05:53 np0005546909 NetworkManager[55691]: <info>  [1764936353.7598] manager: (tap5683f8a8-69): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Dec  5 07:05:53 np0005546909 kernel: tap5683f8a8-69: entered promiscuous mode
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.779 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:53Z|00461|binding|INFO|Claiming lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for this chassis.
Dec  5 07:05:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:53Z|00462|binding|INFO|5683f8a8-691c-43f3-a88f-eb0c30ccb3c5: Claiming fa:16:3e:d3:3c:38 10.100.0.11
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:53Z|00463|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 ovn-installed in OVS
Dec  5 07:05:53 np0005546909 systemd-udevd[225817]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:05:53 np0005546909 nova_compute[187208]: 2025-12-05 12:05:53.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:53 np0005546909 systemd-machined[153543]: New machine qemu-62-instance-0000003a.
Dec  5 07:05:53 np0005546909 NetworkManager[55691]: <info>  [1764936353.8556] device (tap5683f8a8-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:05:53 np0005546909 systemd[1]: Started Virtual Machine qemu-62-instance-0000003a.
Dec  5 07:05:53 np0005546909 NetworkManager[55691]: <info>  [1764936353.8568] device (tap5683f8a8-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.030 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:3c:38 10.100.0.11'], port_security=['fa:16:3e:d3:3c:38 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.032 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.034 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:05:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:54Z|00464|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 up in Southbound
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.057 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9919ce7e-053e-446f-aaaf-65126e43c3a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.092 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[657ae797-c2a1-40b8-801e-a4a953664df9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.096 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8813d7f2-8d5f-4cb4-a82e-2fffa0f5c38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.132 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bbcbc48e-57f0-498c-833c-b984c7e1fd90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.156 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936354.1561198, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.157 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Started (Lifecycle Event)#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3df95e9c-9c82-405c-a0ed-947c7a23749d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225839, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[607b2c59-304a-46ba-82e1-a7a836bc01d2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225840, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225840, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.178 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.183 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.183 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.184 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:05:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:05:54.184 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.233 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.237 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936354.1593728, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.238 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.254 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.257 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:05:54 np0005546909 nova_compute[187208]: 2025-12-05 12:05:54.276 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:05:55 np0005546909 nova_compute[187208]: 2025-12-05 12:05:55.229 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updated VIF entry in instance network info cache for port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:05:55 np0005546909 nova_compute[187208]: 2025-12-05 12:05:55.229 187212 DEBUG nova.network.neutron [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [{"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:05:55 np0005546909 nova_compute[187208]: 2025-12-05 12:05:55.257 187212 DEBUG oslo_concurrency.lockutils [req-8a05506f-c556-4cc0-ad5c-927b7871550e req-286362a2-4f54-4a40-8b13-7dc83c94bf79 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b81bb939-d14f-4a72-b7fe-95fc5d8810a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.041 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.042 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.059 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.127 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.127 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.136 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.141 187212 INFO nova.compute.claims [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:05:56 np0005546909 podman[225855]: 2025-12-05 12:05:56.230824195 +0000 UTC m=+0.074566885 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  5 07:05:56 np0005546909 podman[225854]: 2025-12-05 12:05:56.23379073 +0000 UTC m=+0.079884307 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.314 187212 DEBUG nova.compute.provider_tree [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.328 187212 DEBUG nova.scheduler.client.report [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.351 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.352 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.395 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.396 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.414 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.428 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:05:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:56Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:05:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:56Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.543 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.545 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.545 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating image(s)
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.546 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.546 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.547 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.560 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.621 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.622 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.623 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.635 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.693 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.694 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.731 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.733 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.733 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.757 187212 DEBUG nova.policy [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.793 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.794 187212 DEBUG nova.virt.disk.api [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Checking if we can resize image /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.795 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.862 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.864 187212 DEBUG nova.virt.disk.api [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Cannot resize image /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.864 187212 DEBUG nova.objects.instance [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'migration_context' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.879 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.880 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Ensure instance console log exists: /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.880 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.881 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.881 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:05:56 np0005546909 nova_compute[187208]: 2025-12-05 12:05:56.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:57 np0005546909 nova_compute[187208]: 2025-12-05 12:05:57.486 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:57 np0005546909 nova_compute[187208]: 2025-12-05 12:05:57.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:57Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:08:e8:08 10.100.0.14
Dec  5 07:05:57 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:57Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:08:e8:08 10.100.0.14
Dec  5 07:05:58 np0005546909 nova_compute[187208]: 2025-12-05 12:05:58.299 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Successfully created port: 821e6243-8d28-4c8c-874c-f1e69c7d3bed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:05:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:58Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:05:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:05:58Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:05:58 np0005546909 nova_compute[187208]: 2025-12-05 12:05:58.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:05:58 np0005546909 nova_compute[187208]: 2025-12-05 12:05:58.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.210 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Successfully updated port: 821e6243-8d28-4c8c-874c-f1e69c7d3bed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.264 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.265 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.265 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.474 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.647 187212 DEBUG nova.compute.manager [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.647 187212 DEBUG nova.compute.manager [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:06:00 np0005546909 nova_compute[187208]: 2025-12-05 12:06:00.648 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:01 np0005546909 podman[225936]: 2025-12-05 12:06:01.234902033 +0000 UTC m=+0.071990230 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:06:01 np0005546909 podman[225937]: 2025-12-05 12:06:01.27200003 +0000 UTC m=+0.109998164 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.623 187212 DEBUG nova.compute.manager [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.624 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG oslo_concurrency.lockutils [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.625 187212 DEBUG nova.compute.manager [req-4fa4958d-d68d-4d53-aa46-5b289c7da372 req-e24b6bfd-db6a-437a-8fe3-ca067f47b891 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Processing event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.626 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance event wait completed in 7 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.633 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936361.6320565, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.633 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Resumed (Lifecycle Event)
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.636 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.641 187212 INFO nova.virt.libvirt.driver [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance spawned successfully.
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.642 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.663 187212 DEBUG nova.network.neutron [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.669 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.674 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.675 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.676 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.677 187212 DEBUG nova.virt.libvirt.driver [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.715 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.719 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.719 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance network_info: |[{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.720 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.720 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.723 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start _get_guest_xml network_info=[{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.727 187212 WARNING nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.733 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.734 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.740 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.740 187212 DEBUG nova.virt.libvirt.host [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.741 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.742 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.743 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.744 187212 DEBUG nova.virt.hardware [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.748 187212 DEBUG nova.virt.libvirt.vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_use
r_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:56Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.748 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.749 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.750 187212 DEBUG nova.objects.instance [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'pci_devices' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.764 187212 INFO nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 17.81 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.765 187212 DEBUG nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.770 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <uuid>297d72ef-6b79-45b3-813b-52b5144b522e</uuid>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <name>instance-0000003b</name>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:name>tempest-FloatingIPsAssociationTestJSON-server-2111676304</nova:name>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:01</nova:creationTime>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:user uuid="8cf2534e7c394130b675e44ed567401b">tempest-FloatingIPsAssociationTestJSON-883508882-project-member</nova:user>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:project uuid="85037de7275442698e604ee3f6283cbc">tempest-FloatingIPsAssociationTestJSON-883508882</nova:project>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        <nova:port uuid="821e6243-8d28-4c8c-874c-f1e69c7d3bed">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="serial">297d72ef-6b79-45b3-813b-52b5144b522e</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="uuid">297d72ef-6b79-45b3-813b-52b5144b522e</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:a6:47:26"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <target dev="tap821e6243-8d"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/console.log" append="off"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:01 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:01 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:01 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:01 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.772 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Preparing to wait for external event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.773 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.775 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.775 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.776 187212 DEBUG nova.virt.libvirt.vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882'
,owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:05:56Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.776 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.777 187212 DEBUG nova.network.os_vif_util [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.777 187212 DEBUG os_vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.779 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.785 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.785 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap821e6243-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.786 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap821e6243-8d, col_values=(('external_ids', {'iface-id': '821e6243-8d28-4c8c-874c-f1e69c7d3bed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:47:26', 'vm-uuid': '297d72ef-6b79-45b3-813b-52b5144b522e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:01 np0005546909 NetworkManager[55691]: <info>  [1764936361.7888] manager: (tap821e6243-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.799 187212 INFO os_vif [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d')#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.840 187212 INFO nova.compute.manager [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 18.42 seconds to build instance.#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.861 187212 DEBUG oslo_concurrency.lockutils [None req-17c87829-6128-4d78-83b2-e8547cfabc86 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.865 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.865 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.866 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] No VIF found with MAC fa:16:3e:a6:47:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:01 np0005546909 nova_compute[187208]: 2025-12-05 12:06:01.866 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Using config drive#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.632 187212 INFO nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Creating config drive at /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.638 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpudr9a0j1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.765 187212 DEBUG oslo_concurrency.processutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpudr9a0j1" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:02 np0005546909 NetworkManager[55691]: <info>  [1764936362.8237] manager: (tap821e6243-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/198)
Dec  5 07:06:02 np0005546909 kernel: tap821e6243-8d: entered promiscuous mode
Dec  5 07:06:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:02Z|00465|binding|INFO|Claiming lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed for this chassis.
Dec  5 07:06:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:02Z|00466|binding|INFO|821e6243-8d28-4c8c-874c-f1e69c7d3bed: Claiming fa:16:3e:a6:47:26 10.100.0.9
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.844 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:47:26 10.100.0.9'], port_security=['fa:16:3e:a6:47:26 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=821e6243-8d28-4c8c-874c-f1e69c7d3bed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.846 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 821e6243-8d28-4c8c-874c-f1e69c7d3bed in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d bound to our chassis#033[00m
Dec  5 07:06:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:02Z|00467|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed ovn-installed in OVS
Dec  5 07:06:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:02Z|00468|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed up in Southbound
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.850 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.853 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 systemd-machined[153543]: New machine qemu-63-instance-0000003b.
Dec  5 07:06:02 np0005546909 systemd-udevd[226016]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.873 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[44d4931b-6ec8-4c67-b89b-6f4b47276039]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 NetworkManager[55691]: <info>  [1764936362.8803] device (tap821e6243-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:02 np0005546909 NetworkManager[55691]: <info>  [1764936362.8815] device (tap821e6243-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:02 np0005546909 systemd[1]: Started Virtual Machine qemu-63-instance-0000003b.
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.916 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3e451133-b566-431e-8a79-b3aefc2f652e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.920 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9d134e1f-cf08-4461-9f55-de6f482bead6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.946 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a2be27-8442-4baa-bf8b-bc1c34f8eb87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.963 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[adac84a7-f8a8-4931-b53b-9e9010d0e24a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226035, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af027261-317c-42f0-869c-606e62e07f88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372348, 'tstamp': 372348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226036, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372350, 'tstamp': 372350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226036, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.987 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 nova_compute[187208]: 2025-12-05 12:06:02.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.991 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.991 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.992 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:02.992 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.012 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.554 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.555 187212 DEBUG nova.network.neutron [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.578 187212 DEBUG oslo_concurrency.lockutils [req-01de7068-2fcf-410e-a120-814034a4b5af req-03bc027d-7b32-4703-8e70-d0c145c594ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.688 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936363.6872625, 297d72ef-6b79-45b3-813b-52b5144b522e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.688 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.700 187212 DEBUG nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG oslo_concurrency.lockutils [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.701 187212 DEBUG nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.702 187212 WARNING nova.compute.manager [req-ba707b70-60fd-4219-96a6-160ad429f488 req-26677a60-e60c-4795-bae5-88e08c0b7ad2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.712 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.718 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936363.6874504, 297d72ef-6b79-45b3-813b-52b5144b522e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.718 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.739 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.743 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:03 np0005546909 nova_compute[187208]: 2025-12-05 12:06:03.764 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:04Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:a8:16 10.100.0.13
Dec  5 07:06:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:04Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:a8:16 10.100.0.13
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.403 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.403 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.430 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.512 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.513 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.518 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.519 187212 INFO nova.compute.claims [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.776 187212 DEBUG nova.compute.provider_tree [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.799 187212 DEBUG nova.scheduler.client.report [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.821 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.822 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.877 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.878 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.913 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:06:04 np0005546909 nova_compute[187208]: 2025-12-05 12:06:04.934 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.032 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.035 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.036 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating image(s)#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.037 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.037 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.038 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.055 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.130 187212 DEBUG nova.policy [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.134 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.135 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.135 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.146 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.220 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.221 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.255 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.256 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.256 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.317 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.318 187212 DEBUG nova.virt.disk.api [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.318 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.377 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.378 187212 DEBUG nova.virt.disk.api [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.378 187212 DEBUG nova.objects.instance [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.399 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.400 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Ensure instance console log exists: /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.401 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:05 np0005546909 nova_compute[187208]: 2025-12-05 12:06:05.401 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:06 np0005546909 podman[226059]: 2025-12-05 12:06:06.232201257 +0000 UTC m=+0.076089859 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.512 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully created port: 2064bfa7-125e-466c-9365-6c0ec6655113 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.789 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.812 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.813 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.833 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.914 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.915 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.921 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:06:06 np0005546909 nova_compute[187208]: 2025-12-05 12:06:06.922 187212 INFO nova.compute.claims [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.151 187212 DEBUG nova.compute.provider_tree [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.167 187212 DEBUG nova.scheduler.client.report [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.192 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.193 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.259 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.260 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.286 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.311 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.432 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.434 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.435 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating image(s)#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.436 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.436 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.437 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.461 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.522 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.523 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.523 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.535 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.592 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.599 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.600 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.627 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.648 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.649 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.649 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.709 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.710 187212 DEBUG nova.virt.disk.api [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Checking if we can resize image /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.711 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.766 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.767 187212 DEBUG nova.virt.disk.api [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Cannot resize image /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.767 187212 DEBUG nova.objects.instance [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'migration_context' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.789 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.789 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Ensure instance console log exists: /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:07 np0005546909 nova_compute[187208]: 2025-12-05 12:06:07.790 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:08 np0005546909 nova_compute[187208]: 2025-12-05 12:06:08.113 187212 DEBUG nova.policy [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:10 np0005546909 nova_compute[187208]: 2025-12-05 12:06:10.074 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:10 np0005546909 nova_compute[187208]: 2025-12-05 12:06:10.613 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully updated port: 2064bfa7-125e-466c-9365-6c0ec6655113 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:06:10 np0005546909 nova_compute[187208]: 2025-12-05 12:06:10.645 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:10 np0005546909 nova_compute[187208]: 2025-12-05 12:06:10.646 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:10 np0005546909 nova_compute[187208]: 2025-12-05 12:06:10.646 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.043 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.044 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.072 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.153 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.153 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.160 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.160 187212 INFO nova.compute.claims [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.207 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.400 187212 DEBUG nova.compute.provider_tree [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.419 187212 DEBUG nova.scheduler.client.report [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.442 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.443 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.493 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.494 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.513 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.528 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.615 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.616 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating image(s)#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.617 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.618 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.633 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.669 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Successfully created port: 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.698 187212 DEBUG nova.policy [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.723 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.724 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.724 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.734 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.792 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.795 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:11 np0005546909 nova_compute[187208]: 2025-12-05 12:06:11.795 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.034 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.035 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.036 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.061 187212 DEBUG nova.compute.manager [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.062 187212 DEBUG nova.compute.manager [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.062 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.110 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.111 187212 DEBUG nova.virt.disk.api [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Checking if we can resize image /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.111 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.165 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.167 187212 DEBUG nova.virt.disk.api [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Cannot resize image /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.168 187212 DEBUG nova.objects.instance [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.189 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.190 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ensure instance console log exists: /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.191 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.191 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.192 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.619 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.655 187212 DEBUG nova.network.neutron [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.681 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance network_info: |[{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.682 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.685 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start _get_guest_xml network_info=[{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.689 187212 WARNING nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.726 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.727 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.host [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.734 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.735 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.736 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.737 187212 DEBUG nova.virt.hardware [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.740 187212 DEBUG nova.virt.libvirt.vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.741 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.742 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.743 187212 DEBUG nova.objects.instance [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.759 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <name>instance-0000003c</name>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:12</nova:creationTime>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="serial">25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="uuid">25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:7b:68:b7"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <target dev="tap2064bfa7-12"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log" append="off"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:12 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:12 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:12 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:12 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.760 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Preparing to wait for external event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.761 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.762 187212 DEBUG nova.virt.libvirt.vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:04Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.762 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.763 187212 DEBUG nova.network.os_vif_util [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.763 187212 DEBUG os_vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2064bfa7-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.767 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2064bfa7-12, col_values=(('external_ids', {'iface-id': '2064bfa7-125e-466c-9365-6c0ec6655113', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:68:b7', 'vm-uuid': '25918fc4-05ec-4a16-b77f-ca1d352a2763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:12 np0005546909 NetworkManager[55691]: <info>  [1764936372.7701] manager: (tap2064bfa7-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.777 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.778 187212 INFO os_vif [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12')#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.850 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Successfully created port: ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:12 np0005546909 podman[226118]: 2025-12-05 12:06:12.872888374 +0000 UTC m=+0.050346898 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.926 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.927 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.927 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:7b:68:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:12 np0005546909 nova_compute[187208]: 2025-12-05 12:06:12.928 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Using config drive#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.574 187212 DEBUG nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.616 187212 INFO nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.617 187212 DEBUG nova.objects.instance [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.812 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Successfully updated port: ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.829 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.830 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.831 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.851 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.857 187212 INFO nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Creating config drive at /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.867 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1gdg89p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:13 np0005546909 nova_compute[187208]: 2025-12-05 12:06:13.982 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.012 187212 DEBUG oslo_concurrency.processutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi1gdg89p" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:14 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.041 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:3c:38 10.100.0.11
Dec  5 07:06:14 np0005546909 kernel: tap2064bfa7-12: entered promiscuous mode
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.0723] manager: (tap2064bfa7-12): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00469|binding|INFO|Claiming lport 2064bfa7-125e-466c-9365-6c0ec6655113 for this chassis.
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.073 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00470|binding|INFO|2064bfa7-125e-466c-9365-6c0ec6655113: Claiming fa:16:3e:7b:68:b7 10.100.0.12
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.089 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:68:b7 10.100.0.12'], port_security=['fa:16:3e:7b:68:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '38cb0acb-7ac3-4fef-baeb-661c59e2e07c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2064bfa7-125e-466c-9365-6c0ec6655113) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00471|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 up in Southbound
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.091 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2064bfa7-125e-466c-9365-6c0ec6655113 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00472|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 ovn-installed in OVS
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.093 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:3c:38 10.100.0.11
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[afc63d61-e925-48de-933f-4e4b9fded5e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.108 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.111 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6da0ed74-6add-46cd-8a87-81b5ad9d87ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.112 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[494f7270-3b7f-4bc9-81fc-8ff52fa93305]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 systemd-machined[153543]: New machine qemu-64-instance-0000003c.
Dec  5 07:06:14 np0005546909 systemd-udevd[226178]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.129 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7636037e-dda2-4f30-833f-78418ed8f4b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 systemd[1]: Started Virtual Machine qemu-64-instance-0000003c.
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.1379] device (tap2064bfa7-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.1408] device (tap2064bfa7-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.148 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.148 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.155 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1398c713-9048-4492-886f-c0df2f6d4b0d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.190 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f563280b-c6fe-4a87-89b4-49aa86167290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 systemd-udevd[226183]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.1963] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.195 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf72da1-39e2-4ffe-b580-16f3fb4ba1bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.228 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.240 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.229 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea2dfe0-40dc-4425-b174-162ca3996ecd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.255 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cce84e-508c-4543-8211-df7e947b13f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.2774] device (tapfbfed6fc-30): carrier: link connected
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.287 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f37e0ba-c8eb-4120-8b1d-887f150b5b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.304 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e995db3-12d6-41b9-b361-80a11ba02df3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226215, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.306 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.307 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e9557ca7-50ae-4e0f-9b1e-068df9ca9a9d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375484, 'tstamp': 375484}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226218, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.337 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3f339b-7a49-4c92-ac6e-ad6916b7e19c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226220, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.359 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.360 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.368 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d83ee95-ffc6-4882-9a17-51867efb0c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.416 187212 DEBUG nova.virt.libvirt.guest [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d76d088-a36a-4a01-8dda-ab47e6241b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.438 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.438 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.439 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:14 np0005546909 NetworkManager[55691]: <info>  [1764936374.4413] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Dec  5 07:06:14 np0005546909 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.446 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.447 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:14Z|00473|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.462 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.463 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7e2ae5-9a7d-4c33-9670-70068e5468cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.464 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:06:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:14.465 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.668 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936374.6677818, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.668 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.689 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.693 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936374.6679525, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.694 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.715 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.719 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.737 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:14 np0005546909 podman[226273]: 2025-12-05 12:06:14.805973748 +0000 UTC m=+0.049675749 container create b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:06:14 np0005546909 systemd[1]: Started libpod-conmon-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope.
Dec  5 07:06:14 np0005546909 podman[226273]: 2025-12-05 12:06:14.780139195 +0000 UTC m=+0.023841226 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:06:14 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:06:14 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39a6a101eec5b4484b9949c620581716165bc1e8dfc205f2cf46df83cc7fa1cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:06:14 np0005546909 podman[226273]: 2025-12-05 12:06:14.911368998 +0000 UTC m=+0.155071019 container init b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:06:14 np0005546909 podman[226273]: 2025-12-05 12:06:14.918964866 +0000 UTC m=+0.162666867 container start b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.920 187212 DEBUG nova.virt.libvirt.guest [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.925 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Dec  5 07:06:14 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : New worker (226296) forked
Dec  5 07:06:14 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : Loading success.
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.969 187212 DEBUG nova.privsep.utils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:06:14 np0005546909 nova_compute[187208]: 2025-12-05 12:06:14.970 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.081 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Successfully updated port: 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.103 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.104 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.104 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.394 187212 DEBUG oslo_concurrency.processutils [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70.delta /var/lib/nova/instances/snapshots/tmpkohpd323/854f0e1423c64ac78ed77fb381f37b70" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.400 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.604 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.658 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.659 187212 DEBUG nova.network.neutron [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.682 187212 DEBUG oslo_concurrency.lockutils [req-3c3c4081-2cc8-4e54-aec2-c056b2ef01d9 req-6a3a394b-2221-4a19-956a-6d001b2e7fee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG nova.compute.manager [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG nova.compute.manager [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:15 np0005546909 nova_compute[187208]: 2025-12-05 12:06:15.738 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.070 187212 DEBUG nova.network.neutron [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.783 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance network_info: |[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.784 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.787 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start _get_guest_xml network_info=[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.791 187212 WARNING nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.797 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.797 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.804 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.libvirt.host [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.805 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.806 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.807 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.808 187212 DEBUG nova.virt.hardware [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.813 187212 DEBUG nova.virt.libvirt.vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.813 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.814 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.815 187212 DEBUG nova.objects.instance [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.832 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <uuid>5d70ac2d-111f-4e1b-ac26-3e02849b0458</uuid>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <name>instance-0000003e</name>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-795100487</nova:name>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:16</nova:creationTime>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:user uuid="bc4332be3b424a5e996b61b244505cfc">tempest-AttachVolumeShelveTestJSON-1858452545-project-member</nova:user>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:project uuid="6d62df5807554f499d26b5fc77ec8603">tempest-AttachVolumeShelveTestJSON-1858452545</nova:project>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        <nova:port uuid="ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="serial">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="uuid">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:6a:c5:99"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <target dev="tapac02dd63-5a"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log" append="off"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:16 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:16 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:16 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:16 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Preparing to wait for external event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.834 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.835 187212 DEBUG nova.virt.libvirt.vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.835 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG nova.network.os_vif_util [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG os_vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.837 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.837 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.839 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac02dd63-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.840 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac02dd63-5a, col_values=(('external_ids', {'iface-id': 'ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:c5:99', 'vm-uuid': '5d70ac2d-111f-4e1b-ac26-3e02849b0458'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:16 np0005546909 NetworkManager[55691]: <info>  [1764936376.8429] manager: (tapac02dd63-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.855 187212 INFO os_vif [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.902 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.902 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.903 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No VIF found with MAC fa:16:3e:6a:c5:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:16 np0005546909 nova_compute[187208]: 2025-12-05 12:06:16.903 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Using config drive#033[00m
Dec  5 07:06:17 np0005546909 nova_compute[187208]: 2025-12-05 12:06:17.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:17 np0005546909 nova_compute[187208]: 2025-12-05 12:06:17.856 187212 INFO nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating config drive at /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config#033[00m
Dec  5 07:06:17 np0005546909 nova_compute[187208]: 2025-12-05 12:06:17.865 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu2ryloe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:17 np0005546909 nova_compute[187208]: 2025-12-05 12:06:17.993 187212 DEBUG oslo_concurrency.processutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsu2ryloe" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:18 np0005546909 kernel: tapac02dd63-5a: entered promiscuous mode
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.0778] manager: (tapac02dd63-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:18Z|00474|binding|INFO|Claiming lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for this chassis.
Dec  5 07:06:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:18Z|00475|binding|INFO|ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b: Claiming fa:16:3e:6a:c5:99 10.100.0.8
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.086 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.088 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 bound to our chassis#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.091 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc6ce614-d0f7-413f-bc3e-26f7271993d9#033[00m
Dec  5 07:06:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:18Z|00476|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b ovn-installed in OVS
Dec  5 07:06:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:18Z|00477|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b up in Southbound
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.104 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce3627a7-f3d6-4b18-9a7e-c9d02ee8b4fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.105 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc6ce614-d1 in ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.107 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc6ce614-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5de38a04-4fd3-4623-9712-2c1cfae14d42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.108 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0906a4b1-50d4-4679-9b10-d48f713a5b49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.121 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[10a0bd83-b779-4382-b34d-67695d5d4156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 systemd-udevd[226339]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:18 np0005546909 systemd-machined[153543]: New machine qemu-65-instance-0000003e.
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[90554aae-5851-4cbf-b7c0-8d8d1f56eb2b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 systemd[1]: Started Virtual Machine qemu-65-instance-0000003e.
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.1447] device (tapac02dd63-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.1460] device (tapac02dd63-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG nova.compute.manager [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG nova.compute.manager [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.169 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.167 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0120a36d-9ab0-4fbd-a173-66f1e64da165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.176 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f86e91ca-e99e-4a4f-a078-0ab5dab2b86d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.1771] manager: (tapfc6ce614-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/205)
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.208 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[33788a94-f500-4c55-9bbe-930f00588e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.212 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dfc8d872-8986-43a5-af94-b50fccc13e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.231 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.231 187212 DEBUG nova.network.neutron [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.2418] device (tapfc6ce614-d0): carrier: link connected
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.248 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce09589-08a1-48c1-9cfa-765b06e7b640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.253 187212 DEBUG oslo_concurrency.lockutils [req-8dc50563-e8e5-472b-906b-0bd743df8070 req-355b72b9-ef4b-46ee-9368-f28078bdf1f2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.267 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2fd0be-f9ee-4bd4-bdde-6ca4e9d0fb93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375880, 'reachable_time': 38901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226372, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.281 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a818749b-4ee2-4aeb-a069-e3702fcf1cfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:6b90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375880, 'tstamp': 375880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226373, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.298 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ea07bec4-e61f-4978-8f04-c665338ce61c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375880, 'reachable_time': 38901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226374, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.315 187212 DEBUG nova.network.neutron [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance network_info: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.332 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.333 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.335 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start _get_guest_xml network_info=[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.336 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13abc249-864d-4fb6-b8b5-bd5c2de15a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.339 187212 WARNING nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.346 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.347 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.host [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.353 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.354 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.355 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.356 187212 DEBUG nova.virt.hardware [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.359 187212 DEBUG nova.virt.libvirt.vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.359 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.360 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.361 187212 DEBUG nova.objects.instance [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.377 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <uuid>bcdca3f9-3e24-4209-808c-8093b55e5c2d</uuid>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <name>instance-0000003d</name>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachInterfacesUnderV243Test-server-2105634627</nova:name>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:18</nova:creationTime>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:user uuid="6b73160d333a43ed94d4258262e3c2b5">tempest-AttachInterfacesUnderV243Test-1358924829-project-member</nova:user>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:project uuid="5285f99befb24ac285be8e4fc1d18e69">tempest-AttachInterfacesUnderV243Test-1358924829</nova:project>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        <nova:port uuid="88c7b630-e84b-4a35-8c8f-f934e7cabaf6">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="serial">bcdca3f9-3e24-4209-808c-8093b55e5c2d</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="uuid">bcdca3f9-3e24-4209-808c-8093b55e5c2d</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:bb:19:b7"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <target dev="tap88c7b630-e8"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/console.log" append="off"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:18 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:18 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:18 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:18 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.377 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Preparing to wait for external event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.378 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.virt.libvirt.vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.379 187212 DEBUG nova.network.os_vif_util [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG os_vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.380 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.382 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.382 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88c7b630-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.383 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88c7b630-e8, col_values=(('external_ids', {'iface-id': '88c7b630-e84b-4a35-8c8f-f934e7cabaf6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:19:b7', 'vm-uuid': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.384 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.3855] manager: (tap88c7b630-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.390 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.390 187212 INFO os_vif [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8')#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.396 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.397 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.413 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fae879e6-1f58-428f-abc8-d73bcde91af1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.415 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.416 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.416 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6ce614-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.418 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:06:18 np0005546909 NetworkManager[55691]: <info>  [1764936378.4199] manager: (tapfc6ce614-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Dec  5 07:06:18 np0005546909 kernel: tapfc6ce614-d0: entered promiscuous mode
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.423 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.427 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc6ce614-d0, col_values=(('external_ids', {'iface-id': '1b193bb7-c39e-445c-9a2c-dd8ee58553b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:18 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:18Z|00478|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.428 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.445 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.446 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.447 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[26fb0e5b-6fe5-4cba-b660-d2a4adfdcdd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.447 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:06:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:18.448 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'env', 'PROCESS_TAG=haproxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc6ce614-d0f7-413f-bc3e-26f7271993d9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.470 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] No VIF found with MAC fa:16:3e:bb:19:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.471 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Using config drive#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.501 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.501 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.518 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.518 187212 INFO nova.compute.claims [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.705 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936378.7054462, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.706 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.731 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.735 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936378.7056284, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.735 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.737 187212 DEBUG nova.compute.provider_tree [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.759 187212 DEBUG nova.scheduler.client.report [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.763 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.785 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.786 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.789 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.836 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.836 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:18 np0005546909 podman[226418]: 2025-12-05 12:06:18.838783623 +0000 UTC m=+0.056644740 container create 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.855 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.872 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:06:18 np0005546909 systemd[1]: Started libpod-conmon-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope.
Dec  5 07:06:18 np0005546909 podman[226418]: 2025-12-05 12:06:18.803712325 +0000 UTC m=+0.021573472 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:06:18 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:06:18 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1f29ee6b8791a73cdc634bfdec56c186b809e5d9ad3d5a603996e4c487e56a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:06:18 np0005546909 podman[226418]: 2025-12-05 12:06:18.938363016 +0000 UTC m=+0.156224153 container init 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:06:18 np0005546909 podman[226418]: 2025-12-05 12:06:18.944132271 +0000 UTC m=+0.161993388 container start 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:06:18 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : New worker (226440) forked
Dec  5 07:06:18 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : Loading success.
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.975 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.976 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.977 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating image(s)#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.977 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.978 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.978 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:18 np0005546909 nova_compute[187208]: 2025-12-05 12:06:18.991 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.047 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.048 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.049 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.061 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.084 187212 INFO nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Creating config drive at /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.092 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxlj9h5z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.126 187212 INFO nova.virt.libvirt.driver [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.129 187212 INFO nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 5.48 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.133 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.133 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.172 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.173 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.173 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.200 187212 DEBUG nova.policy [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.225 187212 DEBUG oslo_concurrency.processutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzxlj9h5z" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.231 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.233 187212 DEBUG nova.virt.disk.api [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.233 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.297 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.298 187212 DEBUG nova.virt.disk.api [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.299 187212 DEBUG nova.objects.instance [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:19 np0005546909 kernel: tap88c7b630-e8: entered promiscuous mode
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.3116] manager: (tap88c7b630-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/208)
Dec  5 07:06:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:19Z|00479|binding|INFO|Claiming lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for this chassis.
Dec  5 07:06:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:19Z|00480|binding|INFO|88c7b630-e84b-4a35-8c8f-f934e7cabaf6: Claiming fa:16:3e:bb:19:b7 10.100.0.7
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 systemd-udevd[226357]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.316 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.316 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Ensure instance console log exists: /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.317 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.322 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:19:b7 10.100.0.7'], port_security=['fa:16:3e:bb:19:b7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0566af06-3837-49db-a95c-47b9857e4e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5285f99befb24ac285be8e4fc1d18e69', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a5c5fedc-8874-4d17-85d6-f832393ee546', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b689627-4043-49f3-b45a-0160a35a0a18, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88c7b630-e84b-4a35-8c8f-f934e7cabaf6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.323 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 in datapath 0566af06-3837-49db-a95c-47b9857e4e90 bound to our chassis#033[00m
Dec  5 07:06:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:19Z|00481|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 ovn-installed in OVS
Dec  5 07:06:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:19Z|00482|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 up in Southbound
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.326 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0566af06-3837-49db-a95c-47b9857e4e90#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.326 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.3339] device (tap88c7b630-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.3349] device (tap88c7b630-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.337 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d5b403-9c27-4c11-9b1e-15f5b7da8c9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.337 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0566af06-31 in ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.339 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0566af06-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.339 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e875b99e-e9d4-41c3-b15e-8aa07b6f91d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.340 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Processing event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[077ddd6b-f96a-4477-84f6-a83390c150b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.341 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 DEBUG oslo_concurrency.lockutils [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 DEBUG nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.342 187212 WARNING nova.compute.manager [req-221d7e86-7fe5-42bc-897c-d1805c8d0a97 req-40d7a85d-0160-4b42-ae08-5d731c3c9742 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.344 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.353 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[f08ff02a-b9db-47ce-9d1e-5e3716e75599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.358 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.3576407, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.359 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:19 np0005546909 systemd-machined[153543]: New machine qemu-66-instance-0000003d.
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.365 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:19 np0005546909 systemd[1]: Started Virtual Machine qemu-66-instance-0000003d.
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[53eca702-e6fc-47e0-ac1b-9ca4abed9026]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.372 187212 INFO nova.virt.libvirt.driver [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance spawned successfully.#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.373 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.389 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.396 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.401 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.401 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[16118044-a113-4b53-bbbb-3d7d47684d8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.403 187212 DEBUG nova.virt.libvirt.driver [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.4122] manager: (tap0566af06-30): new Veth device (/org/freedesktop/NetworkManager/Devices/209)
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da376776-7135-4a9f-a08c-e6f93f63d831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.429 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.432 187212 DEBUG nova.compute.manager [None req-c00ee3e2-7fad-4681-ba93-f5fd0bf022d1 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 1 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.445 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35060fb1-4e7e-4b86-8e89-2fe96f356020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.450 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[ada9f199-9492-49df-9bb9-027f549c1063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 podman[226473]: 2025-12-05 12:06:19.456171302 +0000 UTC m=+0.152883526 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.468 187212 INFO nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 14.43 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.468 187212 DEBUG nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.4789] device (tap0566af06-30): carrier: link connected
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.491 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[493ee2d1-06fe-40e3-8e2c-3c0c4a2e4047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.508 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3615ea8b-3c17-4601-96e3-b5d16ec709b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0566af06-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:cb:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376004, 'reachable_time': 18575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226519, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.523 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3234b35d-9c5f-48cd-996f-cfc62630dc06]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee2:cb64'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376004, 'tstamp': 376004}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226520, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.537 187212 INFO nova.compute.manager [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 15.06 seconds to build instance.#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.543 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78a813ad-1a66-4202-97a6-84dfaa85abff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0566af06-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e2:cb:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376004, 'reachable_time': 18575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226521, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.556 187212 DEBUG oslo_concurrency.lockutils [None req-5214e60e-e285-41b4-bdf4-b6c12d3074a6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.573 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b331aec8-4c66-4317-a102-89f1cfbc4f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.637 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62dd7de7-022a-471c-a4c9-9f566a23909a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0566af06-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.639 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0566af06-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 NetworkManager[55691]: <info>  [1764936379.6422] manager: (tap0566af06-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/210)
Dec  5 07:06:19 np0005546909 kernel: tap0566af06-30: entered promiscuous mode
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.646 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0566af06-30, col_values=(('external_ids', {'iface-id': '08ca2eb6-40e5-4c40-8c65-26542a1d3b4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.647 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:19Z|00483|binding|INFO|Releasing lport 08ca2eb6-40e5-4c40-8c65-26542a1d3b4d from this chassis (sb_readonly=0)
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.654 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.6542509, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.655 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.663 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.664 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa396f47-89c3-4220-9838-345a494e8efe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.665 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-0566af06-3837-49db-a95c-47b9857e4e90
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/0566af06-3837-49db-a95c-47b9857e4e90.pid.haproxy
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 0566af06-3837-49db-a95c-47b9857e4e90
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:06:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:19.666 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'env', 'PROCESS_TAG=haproxy-0566af06-3837-49db-a95c-47b9857e4e90', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0566af06-3837-49db-a95c-47b9857e4e90.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.672 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.677 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936379.6544235, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.677 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.694 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.699 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:19 np0005546909 nova_compute[187208]: 2025-12-05 12:06:19.725 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:20 np0005546909 podman[226559]: 2025-12-05 12:06:20.076535486 +0000 UTC m=+0.069404756 container create abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:06:20 np0005546909 systemd[1]: Started libpod-conmon-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope.
Dec  5 07:06:20 np0005546909 podman[226559]: 2025-12-05 12:06:20.032758468 +0000 UTC m=+0.025627768 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:06:20 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:06:20 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b07d63045405ef887718f7278761beb53f95ff32a7c0a77fd47c0591ae50b1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:06:20 np0005546909 podman[226559]: 2025-12-05 12:06:20.159512642 +0000 UTC m=+0.152381942 container init abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:06:20 np0005546909 podman[226559]: 2025-12-05 12:06:20.166527103 +0000 UTC m=+0.159396373 container start abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  5 07:06:20 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : New worker (226580) forked
Dec  5 07:06:20 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : Loading success.
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.679 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Processing event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.680 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 WARNING nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.681 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Processing event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.682 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 DEBUG oslo_concurrency.lockutils [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 DEBUG nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.683 187212 WARNING nova.compute.manager [req-e5285fcf-3649-4e35-b684-052b95dcd782 req-c77a14e8-86dd-42e9-bcaf-2d4032f46f1c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.684 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.684 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.698 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936380.6972528, 297d72ef-6b79-45b3-813b-52b5144b522e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.699 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.702 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.703 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.708 187212 INFO nova.virt.libvirt.driver [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance spawned successfully.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.709 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.711 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance spawned successfully.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.711 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.725 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.732 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.739 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.739 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.740 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.740 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.741 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.741 187212 DEBUG nova.virt.libvirt.driver [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.747 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.747 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.748 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.748 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.749 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.749 187212 DEBUG nova.virt.libvirt.driver [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936380.6976998, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.792 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.795 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.821 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.826 187212 INFO nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 9.21 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.827 187212 DEBUG nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.830 187212 INFO nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 24.29 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.830 187212 DEBUG nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.916 187212 INFO nova.compute.manager [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 9.78 seconds to build instance.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.921 187212 INFO nova.compute.manager [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 24.82 seconds to build instance.#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.939 187212 DEBUG oslo_concurrency.lockutils [None req-ec42b98b-dd14-46dc-9cb9-2b4edae70f9d bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:20 np0005546909 nova_compute[187208]: 2025-12-05 12:06:20.940 187212 DEBUG oslo_concurrency.lockutils [None req-616e4248-22b1-4b69-8fa9-15a10d62a99a 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.898s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:21 np0005546909 nova_compute[187208]: 2025-12-05 12:06:21.158 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Successfully created port: 48b30c48-7858-408b-aeab-df46f6277546 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:21 np0005546909 nova_compute[187208]: 2025-12-05 12:06:21.375 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:21 np0005546909 nova_compute[187208]: 2025-12-05 12:06:21.376 187212 DEBUG nova.network.neutron [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:21 np0005546909 nova_compute[187208]: 2025-12-05 12:06:21.394 187212 DEBUG oslo_concurrency.lockutils [req-76e291fb-bf8d-4150-88a2-060e2ae8ce02 req-5ffd8e71-e36a-4a55-b654-7ef2f14fae40 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:21 np0005546909 nova_compute[187208]: 2025-12-05 12:06:21.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.329 187212 DEBUG nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.370 187212 INFO nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.372 187212 DEBUG nova.objects.instance [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.479 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Successfully updated port: 48b30c48-7858-408b-aeab-df46f6277546 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.493 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.494 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.494 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.635 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process#033[00m
Dec  5 07:06:22 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.767 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.827 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.829 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.884 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.902 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.966 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.970 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:22 np0005546909 nova_compute[187208]: 2025-12-05 12:06:22.971 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.166 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta 1073741824" returned: 0 in 0.195s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.168 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.224 187212 DEBUG nova.virt.libvirt.guest [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.311 187212 DEBUG nova.compute.manager [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.312 187212 DEBUG nova.compute.manager [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.313 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.727 187212 DEBUG nova.virt.libvirt.guest [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.731 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.770 187212 DEBUG nova.network.neutron [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.774 187212 DEBUG nova.privsep.utils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.775 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.800 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.801 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance network_info: |[{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.802 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.802 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.805 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start _get_guest_xml network_info=[{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.809 187212 WARNING nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.816 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.817 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.825 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.libvirt.host [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.826 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.827 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.827 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.828 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.829 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.830 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.830 187212 DEBUG nova.virt.hardware [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.838 187212 DEBUG nova.virt.libvirt.vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:18Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.839 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.841 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:23 np0005546909 nova_compute[187208]: 2025-12-05 12:06:23.842 187212 DEBUG nova.objects.instance [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:24 np0005546909 nova_compute[187208]: 2025-12-05 12:06:24.282 187212 DEBUG oslo_concurrency.processutils [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07.delta /var/lib/nova/instances/snapshots/tmpu9kogiwt/5fed2d9a49804cb8bc22caa901567e07" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:24 np0005546909 nova_compute[187208]: 2025-12-05 12:06:24.287 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.229 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <uuid>e9f9bf08-7688-4213-91ff-74f2271ec71d</uuid>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <name>instance-0000003f</name>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:name>tempest-SecurityGroupsTestJSON-server-1685847021</nova:name>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:23</nova:creationTime>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        <nova:port uuid="48b30c48-7858-408b-aeab-df46f6277546">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="serial">e9f9bf08-7688-4213-91ff-74f2271ec71d</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="uuid">e9f9bf08-7688-4213-91ff-74f2271ec71d</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:62:bb:58"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <target dev="tap48b30c48-78"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/console.log" append="off"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:25 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:25 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:25 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:25 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.235 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Preparing to wait for external event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.235 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.236 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.236 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.237 187212 DEBUG nova.virt.libvirt.vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:18Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.237 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.238 187212 DEBUG nova.network.os_vif_util [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.239 187212 DEBUG os_vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.240 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.243 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48b30c48-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.244 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap48b30c48-78, col_values=(('external_ids', {'iface-id': '48b30c48-7858-408b-aeab-df46f6277546', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:bb:58', 'vm-uuid': 'e9f9bf08-7688-4213-91ff-74f2271ec71d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.245 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:25 np0005546909 NetworkManager[55691]: <info>  [1764936385.2467] manager: (tap48b30c48-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.248 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.253 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.254 187212 INFO os_vif [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78')#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.309 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.309 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.310 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No VIF found with MAC fa:16:3e:62:bb:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.310 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Using config drive#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.940 187212 INFO nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Creating config drive at /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config#033[00m
Dec  5 07:06:25 np0005546909 nova_compute[187208]: 2025-12-05 12:06:25.945 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgs9wzbdu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.075 187212 DEBUG oslo_concurrency.processutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgs9wzbdu" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:26 np0005546909 kernel: tap48b30c48-78: entered promiscuous mode
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.1348] manager: (tap48b30c48-78): new Tun device (/org/freedesktop/NetworkManager/Devices/212)
Dec  5 07:06:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:26Z|00484|binding|INFO|Claiming lport 48b30c48-7858-408b-aeab-df46f6277546 for this chassis.
Dec  5 07:06:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:26Z|00485|binding|INFO|48b30c48-7858-408b-aeab-df46f6277546: Claiming fa:16:3e:62:bb:58 10.100.0.8
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.143 187212 DEBUG nova.compute.manager [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG oslo_concurrency.lockutils [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.144 187212 DEBUG nova.compute.manager [req-8115a5a1-a6a6-4e6e-a2c2-5d84ce001b6b req-3fb8d852-d57d-4c26-8b2b-3145fef54441 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Processing event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.145 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:26Z|00486|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 ovn-installed in OVS
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.152 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936386.1508393, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.152 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.157 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:26 np0005546909 systemd-udevd[226638]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.1757] device (tap48b30c48-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.1763] device (tap48b30c48-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.179 187212 INFO nova.virt.libvirt.driver [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance spawned successfully.#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.180 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:26 np0005546909 systemd-machined[153543]: New machine qemu-67-instance-0000003f.
Dec  5 07:06:26 np0005546909 systemd[1]: Started Virtual Machine qemu-67-instance-0000003f.
Dec  5 07:06:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:26Z|00487|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 up in Southbound
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.207 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bb:58 10.100.0.8'], port_security=['fa:16:3e:62:bb:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=48b30c48-7858-408b-aeab-df46f6277546) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.209 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 48b30c48-7858-408b-aeab-df46f6277546 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.211 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.224 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed0607a6-e73e-49c0-9036-61a1c33505cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.225 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd355bd0-51 in ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.228 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.228 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd355bd0-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.229 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a6316c00-0ee9-4900-a51f-95efaa27aaeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.233 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a38dce-7909-4086-9169-9f4a5268f56f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.234 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.240 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.240 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.241 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.241 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.242 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.242 187212 DEBUG nova.virt.libvirt.driver [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.246 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7538f2b1-64af-4931-a288-6f3aee96c7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.271 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.284 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[81e4b2f4-defc-450b-8aa7-aaaceb16e479]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.317 187212 INFO nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 18.88 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.318 187212 DEBUG nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.330 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6627e29c-a548-4b4e-a737-be263b6a7349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.336 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[63754ac1-8382-4c73-ab7e-ecc38790081b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.3394] manager: (tapdd355bd0-50): new Veth device (/org/freedesktop/NetworkManager/Devices/213)
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.381 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2332bc-2b2a-4780-a58d-c9a3e76d9d22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 podman[226651]: 2025-12-05 12:06:26.385411255 +0000 UTC m=+0.107037479 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.386 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0034bd26-61e8-4dac-b558-4f8f4938800d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 podman[226652]: 2025-12-05 12:06:26.41306148 +0000 UTC m=+0.135572019 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.4149] device (tapdd355bd0-50): carrier: link connected
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.420 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4446d0e6-0f9e-403e-af44-807dd8b74f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.443 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7da0cf96-9f7e-4824-9059-bfaec4896e90]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226709, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.457 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72050fbc-f1f6-48dc-8d21-7cb2d1945b65]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:3ad'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376697, 'tstamp': 376697}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226710, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.474 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5f63e2-b7fc-4e60-b29a-faf011b26c2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226711, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.508 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e20e0f-9a67-4745-9e4b-a2162289689f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.577 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7eaf48fa-616c-4a5f-9079-c9f58f8cb867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.579 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.584 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:26 np0005546909 NetworkManager[55691]: <info>  [1764936386.5866] manager: (tapdd355bd0-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Dec  5 07:06:26 np0005546909 kernel: tapdd355bd0-50: entered promiscuous mode
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.589 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:26Z|00488|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.592 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.593 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92c58665-f8f7-427e-9df2-cd285ea698e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.594 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-dd355bd0-560e-4b18-a504-3a5134c930f4
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/dd355bd0-560e-4b18-a504-3a5134c930f4.pid.haproxy
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID dd355bd0-560e-4b18-a504-3a5134c930f4
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:06:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:26.594 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'env', 'PROCESS_TAG=haproxy-dd355bd0-560e-4b18-a504-3a5134c930f4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd355bd0-560e-4b18-a504-3a5134c930f4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.585 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.609 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:26 np0005546909 nova_compute[187208]: 2025-12-05 12:06:26.707 187212 INFO nova.compute.manager [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 19.82 seconds to build instance.#033[00m
Dec  5 07:06:27 np0005546909 nova_compute[187208]: 2025-12-05 12:06:27.045 187212 DEBUG oslo_concurrency.lockutils [None req-bf08c078-c42f-4bde-888e-2b3ce0fb3599 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:27 np0005546909 podman[226741]: 2025-12-05 12:06:27.071770616 +0000 UTC m=+0.120104943 container create 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:06:27 np0005546909 podman[226741]: 2025-12-05 12:06:26.990574742 +0000 UTC m=+0.038909089 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:06:27 np0005546909 systemd[1]: Started libpod-conmon-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope.
Dec  5 07:06:27 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:06:27 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/243fd1c5de96a14826aa2f40632d2c7e7d72bd7fcfdbb36dbcf9215a94d0a31f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:06:27 np0005546909 podman[226741]: 2025-12-05 12:06:27.174628093 +0000 UTC m=+0.222962440 container init 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:06:27 np0005546909 podman[226741]: 2025-12-05 12:06:27.182816539 +0000 UTC m=+0.231150866 container start 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:06:27 np0005546909 nova_compute[187208]: 2025-12-05 12:06:27.223 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936387.2226713, e9f9bf08-7688-4213-91ff-74f2271ec71d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:27 np0005546909 nova_compute[187208]: 2025-12-05 12:06:27.223 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:27 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : New worker (226769) forked
Dec  5 07:06:27 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : Loading success.
Dec  5 07:06:27 np0005546909 nova_compute[187208]: 2025-12-05 12:06:27.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.041 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.045 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936387.2227778, e9f9bf08-7688-4213-91ff-74f2271ec71d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.046 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.413 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.417 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.698 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.699 187212 DEBUG nova.network.neutron [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.846 187212 DEBUG oslo_concurrency.lockutils [req-484a84e4-458a-4181-9827-ffde645c93d4 req-928e9a97-9f9a-48f6-956c-389e00edbdee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:28 np0005546909 nova_compute[187208]: 2025-12-05 12:06:28.848 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.213 187212 DEBUG nova.compute.manager [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.214 187212 DEBUG nova.compute.manager [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-2064bfa7-125e-466c-9365-6c0ec6655113. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.215 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.452 187212 DEBUG nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.453 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG oslo_concurrency.lockutils [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.454 187212 DEBUG nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:30 np0005546909 nova_compute[187208]: 2025-12-05 12:06:30.455 187212 WARNING nova.compute.manager [req-9473d666-4fbd-4449-93b2-245bc47ad0dd req-684fbe08-60b3-496d-89a2-2bf63635c3a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received unexpected event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.517 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.518 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.518 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.522 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.523 187212 DEBUG nova.objects.instance [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'flavor' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:31 np0005546909 nova_compute[187208]: 2025-12-05 12:06:31.548 187212 DEBUG nova.virt.libvirt.driver [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.032 187212 INFO nova.virt.libvirt.driver [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.033 187212 INFO nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 9.63 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.136 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 2064bfa7-125e-466c-9365-6c0ec6655113. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.137 187212 DEBUG nova.network.neutron [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.159 187212 DEBUG oslo_concurrency.lockutils [req-837eb906-fb2f-486e-8f5c-57ac946d2a64 req-4c683ea0-5f8f-4cc2-b302-faa32aeb90f5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:32 np0005546909 podman[226799]: 2025-12-05 12:06:32.232038453 +0000 UTC m=+0.086829810 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:06:32 np0005546909 podman[226800]: 2025-12-05 12:06:32.250044476 +0000 UTC m=+0.105079291 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.289 187212 DEBUG nova.compute.manager [None req-8ad3d67d-4e13-4c32-86df-e4bfc5072282 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 2 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Dec  5 07:06:32 np0005546909 nova_compute[187208]: 2025-12-05 12:06:32.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:32Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:68:b7 10.100.0.12
Dec  5 07:06:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:32Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:68:b7 10.100.0.12
Dec  5 07:06:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:32Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:47:26 10.100.0.9
Dec  5 07:06:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:32Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:47:26 10.100.0.9
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.680 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.680 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.695 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:06:33 np0005546909 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec  5 07:06:33 np0005546909 NetworkManager[55691]: <info>  [1764936393.7578] device (tap549318e9-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.769 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.770 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.778 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.779 187212 INFO nova.compute.claims [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:06:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:33Z|00489|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec  5 07:06:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:33Z|00490|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec  5 07:06:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:33Z|00491|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.965 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.966 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis#033[00m
Dec  5 07:06:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.968 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:06:33 np0005546909 nova_compute[187208]: 2025-12-05 12:06:33.969 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:33.989 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7097bb31-9163-4b4e-a712-b3dbd3f187f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:33 np0005546909 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000038.scope: Deactivated successfully.
Dec  5 07:06:33 np0005546909 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000038.scope: Consumed 14.503s CPU time.
Dec  5 07:06:33 np0005546909 systemd-machined[153543]: Machine qemu-60-instance-00000038 terminated.
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.022 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[488531f2-73f9-4ef1-9582-5c7e92db95ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.025 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e0102e0e-0399-4f8d-9ce6-2ab4712fff5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.043 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.043 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.044 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.044 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Processing event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.045 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 WARNING nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received unexpected event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.046 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.047 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.047 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.048 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.051 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936394.0514553, e9f9bf08-7688-4213-91ff-74f2271ec71d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.051 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.054 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.055 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0832c860-da92-4076-a30e-953f7dc0df31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.071 187212 INFO nova.virt.libvirt.driver [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance spawned successfully.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.072 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.088 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.092 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.100 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.092 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3aa2d8-80bf-4af8-a9ac-900fa6c37abd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226889, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.101 187212 DEBUG nova.virt.libvirt.driver [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ad39cb6-dca0-450a-a268-d43085e329cf]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226890, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226890, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.120 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.125 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.125 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.126 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.130 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.132 187212 DEBUG nova.compute.provider_tree [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.155 187212 DEBUG nova.scheduler.client.report [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.168 187212 INFO nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 15.19 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.168 187212 DEBUG nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.177 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.177 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:06:34 np0005546909 systemd-udevd[226881]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:34 np0005546909 kernel: tap549318e9-e6: entered promiscuous mode
Dec  5 07:06:34 np0005546909 NetworkManager[55691]: <info>  [1764936394.1856] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.188 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00492|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00493|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:06:34 np0005546909 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.201 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.203 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.207 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00494|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00495|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00496|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=1)
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00497|if_status|INFO|Dropped 5 log messages in last 88 seconds (most recently, 88 seconds ago) due to excessive rate
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00498|if_status|INFO|Not setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down as sb is readonly
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.220 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00499|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.222 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c40efcd-56c3-45ca-996d-a856ae9cfbbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00500|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec  5 07:06:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:34Z|00501|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.231 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.258 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c34f156f-5c4e-411e-ac2d-79b34898f457]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.262 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1d7f5a-56ef-410c-bc00-f8374ba86bda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.279 187212 INFO nova.compute.manager [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 15.80 seconds to build instance.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.280 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.281 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.304 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f90f95-3dc2-46f8-b43a-4a5b66afb232]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.306 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.309 187212 DEBUG oslo_concurrency.lockutils [None req-4c3345ba-0b36-4e1c-afc8-47831f815883 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.912s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.320 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2112ddc-f8fa-4e04-97c4-73a9045e409f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226907, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.327 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0397192e-0598-4948-a3b2-8534ebb15987]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226908, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226908, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.357 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.364 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.364 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.365 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.366 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.366 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.368 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.372 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.387 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ec5c8000-edd5-41b8-b663-68b0ca9618b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.418 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d25d2ddc-96ae-4cbb-b1cf-5599038892b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.423 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[52f6c2b0-d524-4ea7-a85b-f9105ad44d3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.440 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.441 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.442 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating image(s)#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.442 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.443 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.443 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.451 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a04ccb74-fd5b-4ce6-9abe-711c99c0c225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.463 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f9764a8c-0b5d-4128-aefc-d49f7112232d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226926, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.494 187212 DEBUG nova.policy [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '077bcce844cb42a197dcd6100549b7d3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.497 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91157c58-c0d5-466c-9517-128c1394ca18]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226928, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226928, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.499 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.505 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.505 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.506 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:34.506 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.538 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.539 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.539 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.551 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.591 187212 DEBUG nova.compute.manager [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.591 187212 DEBUG nova.compute.manager [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.593 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.598 187212 INFO nova.virt.libvirt.driver [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance shutdown successfully after 3 seconds.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.606 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.607 187212 DEBUG nova.objects.instance [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'numa_topology' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.620 187212 DEBUG nova.compute.manager [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.631 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.632 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:34 np0005546909 nova_compute[187208]: 2025-12-05 12:06:34.671 187212 DEBUG oslo_concurrency.lockutils [None req-92fad8d2-7915-40d6-a98d-ee24150d3ef5 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.302 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk 1073741824" returned: 0 in 0.669s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.314 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.315 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:35.362 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.380 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.381 187212 DEBUG nova.virt.disk.api [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Checking if we can resize image /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.382 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.458 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.460 187212 DEBUG nova.virt.disk.api [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Cannot resize image /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.461 187212 DEBUG nova.objects.instance [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'migration_context' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.477 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.477 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Ensure instance console log exists: /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.478 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.479 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:35 np0005546909 nova_compute[187208]: 2025-12-05 12:06:35.640 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Successfully created port: d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.071 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.110 187212 INFO nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] instance snapshotting#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.112 187212 DEBUG nova.objects.instance [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.215 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 05 Dec 2025 12:06:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-2365d6d9-9806-4799-9de8-d299053ad6df x-openstack-request-id: req-2365d6d9-9806-4799-9de8-d299053ad6df _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec  5 07:06:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.215 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}, {"id": "dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec  5 07:06:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.216 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-2365d6d9-9806-4799-9de8-d299053ad6df request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec  5 07:06:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:36.218 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0d7840929940419ee8ea88b703037ee31cfdee552fea31f4c91af9a9732801d7" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec  5 07:06:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:36Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:c5:99 10.100.0.8
Dec  5 07:06:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:36Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:c5:99 10.100.0.8
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.271 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.272 187212 DEBUG nova.network.neutron [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.297 187212 DEBUG oslo_concurrency.lockutils [req-5e4cbee2-f61c-45fc-afbe-4b139b3d89d4 req-0e7345ea-62ba-4100-b2a7-3f1d7a45be67 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.302 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.303 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.319 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.320 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.320 187212 DEBUG nova.compute.manager [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.321 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.398 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning live snapshot process
Dec  5 07:06:36 np0005546909 virtqemud[186841]: invalid argument: disk vda does not have an active block job
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.558 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.624 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.625 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.691 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json -f qcow2" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.710 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.773 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:06:36 np0005546909 nova_compute[187208]: 2025-12-05 12:06:36.775 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.008 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Successfully updated port: d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.031 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.032 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.032 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.050 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta 1073741824" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.051 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Quiescing instance not available: QEMU guest agent is not enabled.
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.118 187212 DEBUG nova.virt.libvirt.guest [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec  5 07:06:37 np0005546909 podman[226963]: 2025-12-05 12:06:37.154205046 +0000 UTC m=+0.066366007 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.171 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.523 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.523 187212 DEBUG nova.network.neutron [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.547 187212 DEBUG oslo_concurrency.lockutils [req-307fdacd-2775-43be-8360-963b77caa873 req-2eb543d4-2bdc-4788-a82c-56e45671ac29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.551 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.551 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.552 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.663 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 496 Content-Type: application/json Date: Fri, 05 Dec 2025 12:06:36 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ace03037-28fd-4f02-b3f3-7eb510a85e5c x-openstack-request-id: req-ace03037-28fd-4f02-b3f3-7eb510a85e5c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.664 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "09233d41-3279-4f39-ac6e-a21662b4f176", "name": "m1.micro", "ram": 192, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/09233d41-3279-4f39-ac6e-a21662b4f176"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.664 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/09233d41-3279-4f39-ac6e-a21662b4f176 used request id req-ace03037-28fd-4f02-b3f3-7eb510a85e5c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.667 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.671 187212 DEBUG nova.virt.libvirt.guest [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.671 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '6d62df5807554f499d26b5fc77ec8603', 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'hostId': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.674 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'name': 'tempest-ListServerFiltersTestJSON-instance-1365452817', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.674 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Skipping quiescing instance: QEMU guest agent is not enabled.
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.677 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000037', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '85037de7275442698e604ee3f6283cbc', 'user_id': '8cf2534e7c394130b675e44ed567401b', 'hostId': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.680 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.683 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '85037de7275442698e604ee3f6283cbc', 'user_id': '8cf2534e7c394130b675e44ed567401b', 'hostId': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.683 187212 DEBUG nova.compute.manager [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.683 187212 DEBUG nova.compute.manager [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing instance network info cache due to event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.684 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.685 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5285f99befb24ac285be8e4fc1d18e69', 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'hostId': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.687 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '442a804e3368417d9de1636d533a25e0', 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'hostId': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.766 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.770 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000039', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8f613c8797e432d96e43223fb7c476d', 'user_id': '4f8149b8192e411a9131b103b25862b6', 'hostId': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.775 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b81bb939-d14f-4a72-b7fe-95fc5d8810a1 / tap5683f8a8-69 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.775 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.783 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5d70ac2d-111f-4e1b-ac26-3e02849b0458 / tapac02dd63-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.784 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.786 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.789 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 472c7e2c-bdad-4230-904b-6937ceb872d2 / tap9357c6a6-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.789 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.793 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 25918fc4-05ec-4a16-b77f-ca1d352a2763 / tap2064bfa7-12 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.793 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.797 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 297d72ef-6b79-45b3-813b-52b5144b522e / tap821e6243-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.798 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.802 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bcdca3f9-3e24-4209-808c-8093b55e5c2d / tap88c7b630-e8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.802 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.806 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e9f9bf08-7688-4213-91ff-74f2271ec71d / tap48b30c48-78 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.806 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.813 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 24358eea-14fb-4863-a6c4-aadcdb495f54 / tap2e9efd6c-74 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.813 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.813 187212 DEBUG nova.privsep.utils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:06:37 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.814 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.817 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8888dd78-1c78-4065-8536-9a1096bdf57b / tapc5cb68aa-e5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.818 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c48d5a33-8e3b-4485-b17c-d4b9de0e41d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9e9415a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'e62dbf65ef2d06872dcfea3e9125814d7dc4b0161c8976b8ab08428d3bb41c42'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9ea8894-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': 'b29590fcfcbf4a8b2c8c6000c6397afdaca0e68ec518201e04a8a3f9674ccb52'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 
'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9eb4608-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'c2053d974d6485f604d6aa08fc4720aeaefbc2dd1902cc1f05759011dafa2cf8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9ebf0f8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '636e15e50014ebbf03b763aded8b952669ba6152a68d2ad723829b6571894c25'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9eca2fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'edd71c0e5e7458a763903fa2dc7be7e661963774ad781ace042f4a8a4a3fa56e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id'
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: -05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9ed5038-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'c56d71772530d8474b93cbf25f3006ff8b270dc4590da2b4fae14bfb3031c4f4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9edeef8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '2b90f6ac461b51278d6ad7279a6922141a97ebb4b3a0b1de58cf98a2ea24ae96'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'd9eef10e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '942e38d850feac17dee95db273004d1fcaf10dc0633b58a677e23d6297f730e8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.770825', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9efaa0e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '23163a3b0e08bfff9d4f3062cbf3fba3dc603afcf80b031cd7d7e4587a1e6add'}]}, 'timestamp': '2025-12-05 12:06:37.818724', '_unique_id': '3b9e28d5f3fa449380ba59f944a17b02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.821 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.822 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.823 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.823 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.824 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.825 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdc3f1af-ce9a-4aa6-9c35-ea70b3f19ac5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9f031b8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '7e5beea0b3bd7e7bc42a3854e74b9472ff698a0d23b73cfb0e3e48e5f78a2482'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9f03fd2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '14563c3dc2c702fbcc20fe9ca5de3d4105bb760cf1a8bd8de1b899af1df88fee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9f07a92-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'a21a759062871747ae65544b60c1216e4a7ff1d1a9784cd453e9bb7a432734ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9f08b54-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'cd93397b619f9723cdd58c6c16aeae2e2ef1480dcbf524a2a2b2321aab30494e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9f09aea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'ca509850007a9355008d165dcbffd6dff2353fe84cf2c5266487550626214212'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: achInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9f0a92c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'ba5d4f51d4bb7bce8f42ce64c780436d57acedf4127f636783cf43afcc3edade'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9f0b502-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'e59efda236c11c7febf6fb079e91c0cc047eaa6b5077333a242a5789a57bc707'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'd9f0c416-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': 
'e7ddf023b894eacbe3879484afabeb17e52ae402469ba8ab40005bbc5d70bbb2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.821738', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9f0d2a8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'a4f23fd44ad5f0b6135e2a5efff0cb759f3c1b16d2d8609ef8cd4aedbe7fa80b'}]}, 'timestamp': '2025-12-05 12:06:37.826280', '_unique_id': '99842ab20c2546ce810a2309b61e4477'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.829 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.830 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.831 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.831 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.832 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.833 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f81d009b-26d8-4201-9eab-7faf8713bcb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'd9f1651a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '834d5d97067c22c33606ccadc307515b58e36aadcc2a1b55d11dc6aa0a346f5e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'd9f17604-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '8b73d82d3f31cea02cf632f5d1471aef7cd748433a8d06260c67e9c1cc0df3fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'd9f1b402-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '9be6f0c67c24a56c64e0b7f9234ed8639b5c107e475dd21405e10b49e8316b0e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'd9f1c550-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '376e901cef0430b04385728518d642d76e46ade79307bf6a06976ba40942b4fd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'd9f1d0c2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '9b8356ef7fff6f92bc114212e110ac66c5052e718a91b7fd6c494e06aabdaf32'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'ins
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: :06:37.829548', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'd9f1e08a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '4b7a65c2d9589e393707231b9e4ae6e6b04c032382f424a6feae683db470e5b6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'd9f1eeb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '9fdd0c782c14a5246b1ed2d1e9396c6bf1d352432e17ca5579767b2a6c0b52cb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'd9f1f9d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '4a0afffab8989a902de80b71bf23ff3fed62843d66864921c8569daec1ff28fc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.829548', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'd9f209b6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '3ca7b82f625e62ecfbca5367494b426ff3be95ae3ac02892aa6f9ddd293ea127'}]}, 'timestamp': '2025-12-05 12:06:37.834240', '_unique_id': '572c8d5fcfcb40a0a46f7f8f4d5c84b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.849 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.850 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.863 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.863 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.865 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.878 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.879 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.892 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.893 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.905 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.905 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.918 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.allocation volume: 29237248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.918 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.931 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.931 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.943 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.944 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.959 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.959 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2efa5488-1e6c-4c2c-a349-a3205641f3f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f4778c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': 'e2cbbd561a7a15deab89449919dffdc91a6ca5b41729325246f75858252264bb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 
'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f4872c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '63faed4943498561b63aaa4334f5fba59961f21d40f45d0617a0e32cb01c508e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f68df6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '78124a62242b797be892bee511d039eb0e6ec3ca8d5c70cc8cc04c321d8ff6cd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f6a020-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': 'c333cf579407119fd160fa611775aa787594bfed8218aff62318baa460e4b005'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 
'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9f8eaec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '481fda08c6a5417983d80d3eb27811b55ce5f63a003cb2db39be4978174d079b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f8f6ea-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '7553d31d41a76d81f01aea0323cc698eea550a4d30aadab482ed3f9fa6353dc7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fb0ade-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '6b4567b728212a1c9d33752acbaad2a7988d83c8417c5d828127ca0bb07b8577'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': 
'2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9fb16dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'e0cde58f24eb6bb4c9d76442eb5e95bd3f7fd60d41a3d09a0775c79527b49006'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fcfc5e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '9d46b3c8b57eb1e51258ca2fc88f426ff3b7f14fb61a9a39632de1a4bca2f486'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9fd0744-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': 'e97ff2ae98a5e9ace515af9c02d9de3448ab1254350770c0d2756ab85cd72337'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29237248, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 
'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9fee96a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': '761ca7d16af80d82dcdb3338fb04617cf329f8a59cc04530405ac8c369517e60'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: message_id': 'd9fef3e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'a423e91cf46cc30e9878c30fe76c1fe9860b8b3efba727807f8c6d781b7c08db'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da00e6c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': 'eec7066328e89ff3897294c2bbf391f557fa4a6ef735c8ed29d6e0c33381c250'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': 
{'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da00f192-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '809ff3176ed24408bc7d75e73b9f187e0b94bac5fffac6f79f1b303475d815dc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 
'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da02d32c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'bc235102f6e298f75beca6371c7c538b42cec9f177a970438b7fa96cdd9ebeb2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da02e100-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '655e7f840ace546bb2fb25c2ff2c78c6d0f836af73b4e91cec79e1f3b1e390d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 
'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da053194-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': '688f420812f6908041f8f666fdec7f0b16a3bf9466fb1b0f85017d160abb0263'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:37.837675', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': 
'6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_na
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 991, 'message_signature': '692b8d0ca9c10fb8532f298990bd433ed011ce7a79751839a757003c15e8fa42'}]}, 'timestamp': '2025-12-05 12:06:37.960045', '_unique_id': 'dfd8b1b37df3477d8a980dfe180dc0ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.962 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.964 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.965 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b3845cc-15a5-475a-b6af-b712e46793a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da05a9a8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'dd081b8beb7931cc913f649b1699bc82b86b9ce5babbba37f4c08ccdfcc0ee80'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
13, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da05b22c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '11ea683ae7af00f2ce46617f9da93877af76d89869588c2bc3cdffaa1680f2cf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da05e7e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '905837230251bd64434fa955172616a5650f4064549ff559152bab7fbd3b271d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da05f03e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '318c8d22631c02497d850ee0932a67f017fdc58facb23298980da12fb2f3cbe6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da05f822-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '51a2d9be47007469724f92a22140dd77caa2622cef623b4e06f5d0d2cd73020e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 1, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: _metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da060290-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'fe1a14da30884607b00263c5c5767c11c631c27c7f425b461acacae767202569'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da060e16-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'ad534aa1c21e2a1dc61c963fab0729aef9214ad01b063d85fe03974ae719911d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da06176c-d1d2-11f0-8572-fa163e006c52', 
'monotonic_time': 3778.428325918, 'message_signature': '4c9c334d27913233966d18d2fd76873c3af59220ab11c082906e1219ea2157df'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:37.962521', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da061fc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'fb21937107a860f932e598d29f472494b89ed095b04cb286cc80b1a1e90c0b71'}]}, 'timestamp': '2025-12-05 12:06:37.965822', '_unique_id': 'd7148a875ffa460abb956b1ff0ea68e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.967 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec  5 07:06:37 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:37.986 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/memory.usage volume: 44.96484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:37.999 187212 DEBUG nova.network.neutron [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.002 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/memory.usage volume: 40.46875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.003 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.018 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/memory.usage volume: 42.640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.020 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.021 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance network_info: |[{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.022 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.022 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.025 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start _get_guest_xml network_info=[{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.820 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.828 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.043 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/memory.usage volume: 40.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.043 187212 WARNING nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.835 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.048 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.049 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.961 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9f8f6e [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: message_id': 'd9fef3e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.5272 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.054 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.055 187212 DEBUG nova.virt.libvirt.host [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.056 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.056 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.057 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.058 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.059 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.059 187212 DEBUG nova.virt.hardware [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:37.966 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.063 187212 DEBUG nova.virt.libvirt.vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.064 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.064 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.065 187212 DEBUG nova.objects.instance [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.066 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/memory.usage volume: 40.4609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.067 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.067 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.068 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.083 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/memory.usage volume: 40.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.087 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <uuid>ed00d159-9d70-481e-93be-ea180fea04ba</uuid>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <name>instance-00000040</name>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:name>guest-instance-1</nova:name>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:38</nova:creationTime>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:user uuid="077bcce844cb42a197dcd6100549b7d3">tempest-ServersV294TestFqdnHostnames-303309807-project-member</nova:user>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:project uuid="dc1fd38e325f4a2caa75aeab79da75d3">tempest-ServersV294TestFqdnHostnames-303309807</nova:project>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        <nova:port uuid="d10caa85-dfcd-49ce-8ff7-2c2a68d1d731">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="serial">ed00d159-9d70-481e-93be-ea180fea04ba</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="uuid">ed00d159-9d70-481e-93be-ea180fea04ba</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:cc:8d:e9"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <target dev="tapd10caa85-df"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/console.log" append="off"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:38 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:38 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:38 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:38 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.088 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Preparing to wait for external event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.088 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.089 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.089 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.090 187212 DEBUG nova.virt.libvirt.vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.090 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.091 187212 DEBUG nova.network.os_vif_util [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.091 187212 DEBUG os_vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.092 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.099 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.100 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.103 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd10caa85-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.103 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd10caa85-df, col_values=(('external_ids', {'iface-id': 'd10caa85-dfcd-49ce-8ff7-2c2a68d1d731', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:8d:e9', 'vm-uuid': 'ed00d159-9d70-481e-93be-ea180fea04ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.104 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.104 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance e9f9bf08-7688-4213-91ff-74f2271ec71d: ceilometer.compute.pollsters.NoVolumeException
Dec  5 07:06:38 np0005546909 NetworkManager[55691]: <info>  [1764936398.1062] manager: (tapd10caa85-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.115 187212 INFO os_vif [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df')
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.125 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/memory.usage volume: 42.59765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.144 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/memory.usage volume: 42.6796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e3addb1-ba85-442e-aad8-0d820d082763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.96484375, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da094b12-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.606937381, 'message_signature': '3118c861f074d266b0ea45aa5a2844d88abfb36a6dc7f1a18ec48a94afdae88d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.46875, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 
'5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da0bbb86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.623031778, 'message_signature': 'b8b5d31784110a44e7e2b2c7ec65aba44a02fbf6f48143c94387529405957935'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.640625, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da0e3032-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.639152616, 'message_signature': 'b5fe262bfffb552279a9f7a25f610865aea361436ca0509ac5648e51d736e6e2'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.43359375, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da11fd7a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.664076019, 'message_signature': '3856babe968395abf9103e4f2a3d942215a4f2448ca3d28f4f48b4026f562852'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4609375, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 
'297d72ef-6b79-45b3-813b-52b5144b522e', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da158738-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.68719179, 'message_signature': 'b64435b6a1c12925640aa1e012c56cffd28a6b9e36d2d1e5ef7980a6908e0931'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.38671875, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da183e6a-
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: b48ce7f8c02419f1ed91413f0e4a7718d4727fc45d6346c63758d7'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59765625, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da1e85d6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.746155211, 'message_signature': 'd75b0be015532be3bc7e9223ca70db24040dff2aa06ee10f92026d563074ce17'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6796875, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'timestamp': '2025-12-05T12:06:37.967586', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 
'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da216ca6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.765158702, 'message_signature': 'aa5799b0ecddebf571031d10e148e490283edb51b6cd37198d6981df749dd51b'}]}, 'timestamp': '2025-12-05 12:06:38.144850', '_unique_id': '43c3a5d04ec94f78b9acb19b93db0c42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.147 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.148 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.148 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.149 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.150 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.150 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dce17ff-3de3-415a-87bb-ce5d594b5ab1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da21d88a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '7f7f9070f8561380076f98097de48e2693f60f3aefb732ad7163e3c71043f9ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
9, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da21e41a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': 'ae296f00c5187bd93672b20cebc865b7931d8ce6f9a3df4ddc55032b52d82aaf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da221408-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '87d7552d491b1d60d9d288f5cef36e2c10ee0d668b6d2db84c8b42808b1d1cf3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da221fa2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '1ed61acdf5938fcc85adac9e5bfe5c903d9a4570553eaef7ba78b78fd197b140'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da2229e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '08665768e3330293b2ae756fc8f5cc632080c1a42cfa90a150392b1a2312bea5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da223492-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'f4fcd3284db8c9f76892817f525e427f32c47c38b51a1d5e55b4cb667471ff75'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da223fdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '4d161bb5adf601fc5783f99281b858c717a86b6643e8aa553028fa003fddf765'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da224a86-d1d2-11f0-8572-fa163e006c52', 
'monotonic_time': 3778.428325918, 'message_signature': 'c8e65dcc1b97dc7505ed596fcc78e4e238a63083c0ded76a332767fad8c473b0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.147207', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da225486-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'f0cc58717e842520999e1152c76320e76fe5aede3e7be2972a29e58b546ef910'}]}, 'timestamp': '2025-12-05 12:06:38.150708', '_unique_id': 'b4a0d70801ec47948370026045fc335f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.152 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.153 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.154 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.154 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.155 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.156 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.157 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.158 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '174acca3-093b-4f08-94ba-95d3024c9d02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22b52a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': 'b8fb840b2e64e728cadffc5c6ac04f18389a87fb71f1105d6f5711065137117f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 
'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da22bfde-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '274decb854b14e3292e7383ebe4cf2547ae6d928b7bb38cab66a8977668730e8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22c9ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '914b799a2f3e146fc7285e70d7fc1e064bbb788c1160ac8936cd8f6591aafe55'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da22d3c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '420178c144c3ea0c7d262c8511b3702abb2222275e30ecc6f98a92b72c768633'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': 
'85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da22fd82-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'f8077042dfe204873d580ca0cd08ee0496cec079b7dfddc3c3d3952ef6b192d7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: b': 1, 'disk_name': 'sda'}, 'message_id': 'da230a5c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'f51ac98e8f3b9ce3ff10b37915346645151b35ddfa4d620e32523ec21be654f2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da23140c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'fdb66cbd7f97afe0c8a712a21e8b60acb72b2d683b9abee802b7c961b4b7d83a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.152850', 
'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da231dc6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '790246eae059a2ff2f2a84fb6b03b122f857a6c14fbc9792ae07f12b409724e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 
'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2327da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': 'acc777fe84828fb24217621c4262946af76c190fd118a7956f061afbc138dd95'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da233284-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '1503a68dc5e32234a681170c7c3cd4944b52e3b80a4970feefbc1abbaab2886f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 
'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da233ee6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'fcd0df19fa03bcfbca1f8754b8f945e00f8bd9cbb03e9e5f2fba9d2ff2f05707'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: nic_time': 3778.527239948, 'message_signature': '26f146b58e1ca66d291fc811eea7d620660d2f543e1a40a15f3fd8e2cec246b3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da23550c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '754ba1a53031bf04583c9ffc0f6de6090a8842513b085e333daee00c61365be5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 
'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da235ebc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '5a8ee1d49dd87e490c3b1bfa34d01589ebe5080db8cfe224c9f1f607c3cba41f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 
'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da236830-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '24316d3399557a241374a16a4482ca4db39efd632b817273a66178fd7173c1a4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2371f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'afb0f3aeac1cb070279f847b69648f0a04f3494814f6b2dda72c7fbdb9744e65'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 
'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da237c94-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': 'c8eee073d4ce2a9e0504cec75ebe07bd7ff4cd3a7383aa3ce02484df4e321ed3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.152850', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da23861c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 6'}]}, 'timestamp': '2025-12-05 12:06:38.158523', '_unique_id': '8899e3de0ad44e8db24bcf2349e26826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.161 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.161 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.bytes volume: 1194 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.162 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.bytes volume: 1368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.163 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.164 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af260438-7f2b-47a4-a6b0-e66f241dcf92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da23f4e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'd944c4bea5b837de1081ae6fd862cb1ae23983d97e9fc5afaec1d920d895db52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1194, 
'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da24004c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '560afd91ce48ef32f61c55b7a9d58c3b101143dcbd9b20e413d5761a6eda513e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da242f2c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '29437fa858fe79c8706f280121f235b859472d1370390e18e1ad767f9f755a52'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da243a26-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'dcca929856bdd6efb79a56b9912d3028087664066ae6411a1298a3619f13a97c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1368, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da244552-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '95345139cbbc19efbc2f9b5f6efdf111baad4c98821f018850fa867e237cb035'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: mpest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da244f8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '37fe8e46403dd3ce67983c09c1bedcc0e400b2e85cd732d22d1b50125852d27a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da2459e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'd31742fb53bd6465d382a233051d948d0091be57c1fb49e3d1dd53d8c7a727bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da2464e2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 
'message_signature': 'c0a3091078fa12a92330f122638007c6c530414d0a391c2b746efe3077abd97f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.161033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da246f32-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '7ed2adfeb9a625cd710631f09e67db50d8530ea332dda717a575f394ccf2f011'}]}, 'timestamp': '2025-12-05 12:06:38.164518', '_unique_id': '0fd1aa92f2c54d0ebff12368200d8055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.166 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.167 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.167 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.168 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.169 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.170 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.171 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c9d39c2-eeb3-4243-aa2c-3dc402783af5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da24cfb8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '3a602cc9abc61d839620c59e858b3e0265b58bf3f71a70426367779f07469e2e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 
'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da24db02-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.45874598, 'message_signature': '1912db51bfa659853833e234430d46ffef96c3dadb8e5b10527dd587f43ce78f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da24e4da-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': '2232da64dbf2a91d46dc4c9d8432d34c307458984741b887c92af83568da6c5d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da24ee62-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.471626794, 'message_signature': 'f208bf2ea970b413154d163ca6ace4a86446a2e8929e99922f6a42ceffb82010'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 
'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da251e3c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': 'eda559a9bdbd1910b84ffec3f852d2877313e74da82af07948dce25ea741ad04'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: phemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da25290e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.48629558, 'message_signature': '6d5bb07c4a6d4ebedf34b77465396d9dd09348a6d066e996cbb12b773947cd97'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da253278-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': '7e6383bdfd9d2d0bbd90572db4750c4e5b971df7ae6ab63b8b09b8cf6b45c6e9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': 
'2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da253b42-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.500617576, 'message_signature': 'd86841d772bc3968ebe454949d94c19a027aaf74d76a55c7425530fb1bbb03c7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da254542-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '3b9f297834832541c21b6e3a3d7fd89f707c7b07fd069ec34d658ffb74f1d251'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da254efc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.514535659, 'message_signature': '0c4ba25a10ff1912eeebf87856e0919d73f5db046afff01e2518be8bec3f1648'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 
'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2558ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'a4d4c3e7bc7874e5939fb5f111bfb3a64e60ec2dcad35f4f6f685f6be6c0e606'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'me
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 'da2562ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'message_signature': 'e9ca3ab3aa0bc8f5d2d40479d992f092a962bf4524271a1e23745ec3cc0c96f3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da256bf8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': '502ee0e147eef004d4516edc0143ae5d9e9de8339cef8a451811230c05277f95'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 
'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2575f8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.539853934, 'message_signature': 'fc851ce5bf0ae3f2307be5e631b1389426548b3d11e014f75b86caf32c329b01'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da257f58-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': '7a047fd1359346e17a2916280d602d8f7c839d8b4efeeaa6fa0f95467e0962eb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2588a4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.552883452, 'message_signature': 'cd2629b41495b66d5fd55eb0fe6045e40071dd03c50aff9a169cae0bd7d9c6d2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': 
'2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2591dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.565594991, 'message_signature': 'a67a902f0f5eff7e1a4322a5096b470a3b5fa636601c90966de762289ccaa190'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.166659', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': 
'6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'me
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: gnature': 'cbc906ebecd0a4fe330ac16e9af25c290a9ce9c1228d58ecc928cba34ce4d277'}]}, 'timestamp': '2025-12-05 12:06:38.172172', '_unique_id': '2638dd16b78c40909453df7198aa101e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.199 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.requests volume: 380 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.199 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.232 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.requests volume: 300 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.232 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.233 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.258 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.259 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.284 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.requests volume: 288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.285 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.299 187212 DEBUG oslo_concurrency.processutils [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb.delta /var/lib/nova/instances/snapshots/tmpnyczfa8f/58866763a26e4d288a94a6228e7ff5bb" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.146 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.305 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.151 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.315 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.requests volume: 299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.316 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.159 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: b': 1, 'disk_name': 'sda'}, 'message_id': 'da230a5c-d1d2-11f0-8572-fa163e006c52' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: nic_time': 3778.527239948, 'message_signature': '26f146b58e1ca66d291fc811eea7d62 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.165 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.173 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: phemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da25290e-d1d2 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  'da2562ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.527239948, 'mess [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.338 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.340 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.341 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] No VIF found with MAC fa:16:3e:cc:8d:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.342 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Using config drive#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.361 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.requests volume: 232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.361 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.376 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'flavor' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.401 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.401 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.412 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.412 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquired lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.413 187212 DEBUG nova.network.neutron [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.413 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'info_cache' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.430 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.431 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.458 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.requests volume: 278 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.458 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f074ed24-c051-437f-a600-eff335bd0f54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 380, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da29d224-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '881577464ff67a8faa98087569a1a1658c1442b3ee6e39e50709321109653f1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da29e138-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '3b94fd97245146b8ffb619e1206eb6f8dc617fdfdf9b82ed8cbc9009f1f2a2d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 300, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da2ed6ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '5f8445008d26050b3fc574106526b78fcaa37d917e8838c4a720bf24e9fb2a81'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da2ee44e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'fd0a442051750bb47ec07f4d3ce183313e5ffc6184c80349e024ce6bbc421ecb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da32db9e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '680330195e0cd353b6cecc593c01c0b7bb23b48902ebd1ab685459388c839213'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'im
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da32e6ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '5fcdc5a5b45e143d80ace757d996fc9a4469f27aa5e783dd5f5f25d1878eab45'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 288, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da36e0ae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'cb5f485c055f67b4cb4db370dc9c7bf89c51fdeb061279697a2d4c4aded79b7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da36f404-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '6312fb80274a6a678033232189ac3d107d80dc21f4546c950508254594fcb49c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 299, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da3b9176-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '3a1348315d5f5f62f121465b2987a4666ca5e15559f9711f8647d5e0d759c88d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da3ba5c6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '40c43f6c4bf2a0daffe01191a251fd4d643ac9970268975293a516d91e12883e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 232, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da428e86-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'ebfc046ee61b3749753e389042b28d876df5ba48f1c667dc96e35f949f29a8c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'im
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da429d36-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'fdaa812690cd86bc97aecf29a9a07e00100064398768e4c1525f22124384564a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da48a4ce-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '24a2dd73dd89496afff1cd42c708aa188b669f69b3db9c12e17fe7c5f67a832b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': 
None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da48afc8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '207d0084c3a97a15c63a6998003ccde7e8a9d3810d5e8ee0d98d7c85bfb2aae4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da4d1b80-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '352608a539b824669d798f2f199dfda027a3cf121106395d07f55621a943dbde'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da4d26ca-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'ae1bc3e1932fb4e999e896f9b4ca475ba019f3c59a6383b80c717a704d2c3350'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 278, 'user_id': 
'4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da515574-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '0d3fb12e3ae71cc9da53d079b01e692e05555beaa071da32649465a91970f51a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.174916', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: pus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da516320-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'e58770076d80c30cbf90a879dc016fa096844ab2c9e7ee3bca1c0478a976ca04'}]}, 'timestamp': '2025-12-05 12:06:38.459253', '_unique_id': '18c9c4df3784464487cec060657b5665'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.462 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.464 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.465 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.466 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.466 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f307985-ac56-4936-a793-00fff57dcae3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da51f1c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '2f9b98dc945be1f3ef5e27028aed65c297ea684c28db63d281ace0b385fc3f07'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da51ff9c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '03945b2680c718a15ccb8744ada8cec228fd14e56295e1686de01e964c849a9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da523dc2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'f6ff2f6c8206b2994322c32c0ca4d9c5dcf4a35aff044229a302dca45ae33eba'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da524a7e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'dad0a419e2109698c7f52089655a12995495d0936bb4458efe3eee06ff1462f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da525df2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '13eeb742ce74eeee19f02d42738ab824c802c6abc65ea42085b9d67b2ef4df71'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'ins
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: :06:38.462401', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da526892-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '5023f7889213e72254f839bca250742e1ed92a6a23511d89a98e7f1cbd640e73'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 
'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da5271b6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '6649680740cfcff854b7eda3a95f4f278ee5df12e09031ad9c52253e057a78da'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'da527d14-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '1edca1b40fdb9f3c8d1b53c13410200e4297d360234641cef439c5ddaa95107f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.462401', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da528f8e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '6a309efbfadae74bcb8b3f87559d1244dc87deaaba5f5a2336d55e036d37f7c8'}]}, 'timestamp': '2025-12-05 12:06:38.466788', '_unique_id': '0e1df6ec9e7d40ef975021892b58970c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.460 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging 
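The traceback above bottoms out in `ConnectionRefusedError: [Errno 111]`, which kombu re-raises as `OperationalError`: the AMQP broker on the notifier's transport URL is not accepting TCP connections at all, so the failure happens before any AMQP handshake. A minimal sketch of the same failure mode at the plain socket level (the host and port here are placeholders, not the broker's real address):

```python
import errno
import socket

def connect_errno(host: str, port: int) -> int:
    """Attempt a TCP connect; return the errno of the refusal, or 0 on success."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return 0
    except ConnectionRefusedError as exc:
        # On Linux this is errno 111 (ECONNREFUSED), matching the log above.
        return exc.errno

# Find a port with nothing listening: bind an ephemeral port, then release it.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

print(connect_errno("127.0.0.1", closed_port))
```

Because the samples in the `Payload=` dict are dropped on the floor when this send fails, the practical fix is restoring the broker (or its address in the transport URL), not anything on the ceilometer side.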
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.579 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.581 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.582 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e0ca005-bec2-44d4-96c0-9a577f633adc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da63d320-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': '1934e9a37cbac1a80b55cc244ce31d3c0ef6e062378df87e5463d4d8c930e823'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da63e018-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '4699dd4f40e0f6705f1ea52b80006c1b8bd2f1db40df12dd9b8c4af108ab1a87'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da640f02-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '26cd36e48f490023a9542b0d4e0f01c7dfa8f2271fd47c8b56a7ef1e9c325e48'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da641768-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '70ccc423012816323018100a6242ca4b037908f88c084029d6a9b9df6684c828'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da641ff6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '2558964e1ac4818308f550898060774491df49b6f8ae363c1ce04a5ae65bef0c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap88c7b630-e8
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: achInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da642906-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'a1afd76e076d7988e0a54917245a44499a75c120ddc6a53e6622906ab9d2825b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da643248-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'aaa4cd224a2d12964829f6961ffb2b3a6c36a4dcf187ce333f7459edfa1afb9a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da643a18-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': 
'2553e717ce387c292d185337ea9588c95f211ebe9b366313c6decce1efd10ceb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.579567', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da6441e8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'a889164b624332d50dc3a6224f2ca84ff96ea44a1b4105566319750bf4cd4909'}]}, 'timestamp': '2025-12-05 12:06:38.582736', '_unique_id': '2cb39730261349bdb88dec18a4492b8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.latency volume: 3600348889 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.584 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.585 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.latency volume: 16389433442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.585 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.578 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.latency volume: 3570009297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.586 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.latency volume: 3788709110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.latency volume: 4092721146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.latency volume: 4788852769 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.587 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 5295983790 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.588 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.latency volume: 3506766134 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.589 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72744bd6-19e0-4bec-b9d1-df6a3d6e0deb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3600348889, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64951c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '68313a01fff7dd2a9af7b867f0b4141726c6c2c1ce25c4a7eb689bf0a9ec9f52'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da649daa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'ed1c80da3d74499e0a5580d36f50d8b0e954bb0199572d4bd523aaf99ae3b15b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16389433442, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64a85e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a862b80a56d2cad757e64aeb95452a54ed3f00f03f97e685fa8764815eaddc49'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64b010-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '9a82d068d8780ee520922a58d8ce6e9c826e84d1c2dd2e3bdfe97060e16dd6ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3570009297, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64d9aa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'bb832b50d5ef1d92471d31f761d26040fbc5d652f011a9a7277d17fd49d11290'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a69
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64e1f2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '9db56b271052870e2b197d891e651b3b9476b6f5a272d1c32a126ce5e353ed46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3788709110, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64e972-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '7500b63040199368fdba8cedc87da211e3c8b7908d6714bc7d2647e1d9851d8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da64f19c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'c499ce6ca27d0127fe3f3e79f2b2d6facf7b4b355973458c16757cf0fd58e197'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4092721146, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da64fb38-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'e4043307c0244cefe56007aab657fe74d5a6da43d8ded0fd71595a6fb76960f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da650290-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '2eb2cd3ff7369b2c0bd931a49503cfda015686cf1e87a9e1060bc34f0bc9ba22'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4788852769, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6509c0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '4e188b3ce4c0c60152331bcf421ea6443462d07ccd7a19b3a8c560ce7d833788'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6510be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'ea68c80cbd89f5470f692cea6e0ab335d4e1a54341b53bc30474b47b2d11e22a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da651898-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '3bd413ac6c83274db8a5d936557a4186c419080a0da3647c5d91cb8eec994a20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 
'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6521e4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '053d963794e8b3bd649bd96677f87ac94fff5f81b10e9bc61e1034b3d1724d1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5295983790, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da652914-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '20227070c85a58b44be3b6ca9675f44e78222e58f58e5b3a263c895855344444'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da653012-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '1ed83f727b7863883c6b4037cbd2291157a169816ca253fbfcfe18aaf222dcbc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3506766134, 'user_id': '4f8149b8192e411a9131b103b25862b6', 
'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6538e6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '9d68deea3d63457108382ad432e698d886fc2d490a4e544c8bff308c2990ea0e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.584631', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'a
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: _gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da654700-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '7768fae8c2e0387a06d0c71e34241413c4b78499875949c0d1b0ed8356034982'}]}, 'timestamp': '2025-12-05 12:06:38.589424', '_unique_id': 'c60cc5ab571343dc86af1c26b9dfe704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.583 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.591 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.593 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.594 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44540d6-b759-4ebb-b7f8-108f459439f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da65a93e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'c0c91ba1f3785f730fc830f6c82ac662e2a9cb78e9e8f4230bb024a79278f2c9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da65b56e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '1ab85df985ecdfff60878ce297fa8af6196e08b02ad2861cf076de0f3f905e15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 
'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da65e2c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': '341685fcd2befbd3b829a839b1b7bfa5d3cfb0039952467f3332dd3dfaca90cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 
1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da65ee44-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': '4a7a98485f0bc4c27a49840f81ec18fd70ce11174ceb072191e10111febfcff4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da65f826-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': 'e9125a0e05a426cb699b1f80d91e60f5a4f16d0e3b71a6724a1e3182602a97dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id'
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: -05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da660334-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': '738a39a79339e00266887a783bd951347fe6309b6a61b3b47e93266a5e386084'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da660d52-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': 'a7274a75733fb48bedf6bc53ab418262b8c0f3b986bbc2d2bbaa6651eb451cd5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 
'da6615cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 'message_signature': '756965e19c2a2b872eab2e7f955cfa398bdfd8b11b2c89feb170981f985f980b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.591665', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da661e5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': 'dcf2efa7d66146309ea58164c4d3f1f10d92397cdae1536664f70c35a9028e37'}]}, 'timestamp': '2025-12-05 12:06:38.594968', '_unique_id': '5bc27ab1ba7e43b8b6cf0f059624bce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.596 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/cpu volume: 11310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.597 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/cpu volume: 12410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/cpu volume: 11710000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.598 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/cpu volume: 11320000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/cpu volume: 11800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/cpu volume: 10760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/cpu volume: 3620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/cpu volume: 11860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.599 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/cpu volume: 12530000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.590 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is:  1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aaaba2f-7866-4cb5-8a2b-d06ba0867d17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11310000000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66861a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.606937381, 'message_signature': '8810c1c9c4a2f1b1b1f42091e3841be5f1788fcfee56054a75b65f667fbcca49'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12410000000, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 
'5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66915a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.623031778, 'message_signature': '60f42b4baa717c61ba3a497dbef9b20ce652d5ad8957a535e51f2611def5260f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11710000000, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66b6ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.639152616, 'message_signature': '83b19c943ed97697a4508ba1daf1996cd3c95c2dc525315e5d2db2baa6440266'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11320000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66be96-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.664076019, 'message_signature': 'ebb89db0d6e1191a4cc2cbbb3ebd68edc1b1216e7d0319c19e3b7c83d034612c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11800000000, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': 
'85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66c79c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.68719179, 'message_signature': '9a73e04d7e29efe04dfcab5a76e6f9f0c70477d9004f0853eb144550d270fec6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10760000000, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 72-fa163e006c52', 'monotonic_time': 3778.704908164, 'message_signature': '1a2e97726bdc926befb9641ea0c189a4f8adb5cb3890b478f29df9210d1387cf'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3620000000, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66d7dc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.725271815, 'message_signature': 'a1a854ff4601e22515b4de9cfb3e8d8fc4677e1999092d229fad6e50fe4dd987'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11860000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 
'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66df48-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.746155211, 'message_signature': 'df38ac90ad8fc555585b74ff5ea1a5a74c4cb758b30bd2c9e6e362cd56bb93d6'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12530000000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'timestamp': '2025-12-05T12:06:38.597224', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 
'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da66e65a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.765158702, 'message_signature': '6fe9cbe6cf74862e27f044a5a4d7d401c66fa9026a9d29de5dd8924bec8c9a53'}]}, 'timestamp': '2025-12-05 12:06:38.600054', '_unique_id': 'e28780f9a1a2477a9e3b5cb493219c67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.601 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.601 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.bytes volume: 28776960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.bytes volume: 29788672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.602 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.603 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.603 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.bytes volume: 29989376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.bytes volume: 30439936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.604 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.bytes volume: 30042624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.bytes volume: 27073536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.605 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.bytes volume: 176446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 30104064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.606 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.607 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.bytes volume: 31005184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.607 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.595 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa33a740-c9fa-40be-ace1-a10250f287f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28776960, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6734ac-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'b56c22a32da4dd9a6b1b88daaeee1b2937c0223570285fbdff3868649cf50e17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': 
None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da673e2a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'a57bd2ef92619c4c69b09335d17a721940af802b83d6b209a2d8b7421c30153f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29788672, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da674870-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'c43b943e7b15c13708bafbe3f79ea429d924f7702f2baf33d181e609b30ce2d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6754be-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '9efb780fa0fb7789dce553bedc7e4704ac292d58b36039e007fa5e39b5ee82af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29989376, 'user_id': '8cf2534e7c394130b675e44ed567401b', 
'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67872c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'b859a4359fe57d0c416cd170c81ddfea281651f9b506a24e87271f686c6db8ea'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6791f4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '2c85c94c1cfdc865c57de546a2f0407e82379b320b5daaa3369020e328c47174'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30439936, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da679c12-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '44a007676c6265bb58b24a415cfc89c5b4dd1286aa6309edd431f4d9da93a318'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67a9f0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '33a0414c4e4df388dcba2fd4d1ca0f1ae2c56a091d78fc0b0ffa9ec0794bfc49'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30042624, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67b332-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '3782f23eec008dcd947f3b40086e61fcd4e7e1f423cfcb0d428fc075959890a8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ba6c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '351d0ea75eb9030a85c7eca07fad5dae8cd71fef21d515e5a54c290210316d47'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27073536, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 
'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67c250-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '391f8e2c9b435c7ff3493a2334fa97119fd9a23acd6208d2078ed34d0c9ba332'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 176446, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: : 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ce76-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'cf7a95b4fbdf839d7f7ce4d6cec7d3bfbfc43d2e7522c349a8036b9464ecc7c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67d8d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '349b3fc5f8e69983d82584a3712c6e4e022de89e8521351ad3a0d6a28d4bb95c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 
'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67e348-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'db82e7118c4db83409af5fc27b24fb6c93809b9417d3d5b99ee756ccab452f22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30104064, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da67eee2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'a336da76c0a3d4b5a105e7c2f11eafb2c1c58060070fb1cd2f3870d8b1ff248b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67f810-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '7e90345c9f1a737355570118c95615228207e97a9bcda1cf7dd731c401162fa7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31005184, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6802ba-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '12c3efb730d4ff5f34bbe2360aec2466571a761d5b603b17158ce939de49718b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.601822', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: a'}, 'message_id': 'da680aa8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '77c91146c8fbf306780a8a5fa4b4d842355ead79ee7a3a5649405b9d656a425b'}]}, 'timestamp': '2025-12-05 12:06:38.607524', '_unique_id': '3c104dc1254c4217a948a16be3cf7c66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.bytes volume: 72880128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.609 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.610 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.bytes volume: 72773632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.610 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.bytes volume: 72900608 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.611 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.612 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 72998912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.613 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.614 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.600 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '134553d5-3cb5-4e85-850a-2bc9536f63e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72880128, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da686a20-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': 'a4f9c7bc215aac52f11f65e2a6ec6bb3e756cd12981e33def80024c5fcf26ed3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': 
None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68752e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '8cc24c82cc75911dd42ca334793f48f0aeaf472c04cb400aa7e5bd1a8c252a4a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72773632, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da687cae-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '4495c114c2261232c6359a00965b737e6f4718274bc35b0bb3bc14e5d8247f41'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da688550-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a5d8d94f81b7532d15537928715e4d06789686ae892620db78288bece6e81e1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72900608, 'user_id': '8cf2534e7c394130b675e44ed567401b', 
'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68a954-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '9de072bd67bb561db2ac38659cfeb9964458464dd6308acb2818fbd89a5149bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b38
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: y_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68b2fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'e75689838f2388f9ea0eec05fffb59644ba5d64970050894fdaffcc8017d07b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68bb24-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': 'cb850020b89fa379e8fb1226733ede4fd1b0970e4ca68e18faf2e9c8cbda87f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68c81c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '012fedfdc7a120ab3709be2d708434b9dd64a8cbf067552ffa28438df335256b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68cfc4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '62c1d5315630a2b0661a9d71a1e3d3dbfa2b444294f099522968002771f55932'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68d6cc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'a034913498d8cc79e87b56351f7cdfe3c5ecd5c0ee799a8cf8fcbbf1a3fca10b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 
'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68de88-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'a10098387bff5f5c9e8a178f6285915f79386869790ef9fd23335e1440f1447e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'arch
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68eac2-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': 'e63a44d2c73ea666e893be7ecd280bb917509840706a3824150017879b3b35f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da68f21a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'd9682717ae0f4a5afc54c3e27c8852114db18fee0a22751ea5ddb95e3d1a2b20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': 
'2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68f922-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '146baac9c74544486cf1f5d63b0f9c05a5bf6220ab9468de3915e79970d31823'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72998912, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da690034-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'e4c96918746aa4390b7359b4e5f2dea74580ddb1c75c86908c0b48e118904615'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da69071e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '741d69d9baef648cd0f215b3950d61a8b60f70d17ba84cfd3a04b22db1309217'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da69110a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '7b87464ce26a143ab928edc4318bf2db596cc792536a841139a40b8cd240dcce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.609680', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '5554f81e09338e5ba011cf392dde6fdb1cc04770e8973ffcd02b3d58cc46bd4a'}]}, 'timestamp': '2025-12-05 12:06:38.614451', '_unique_id': 'fbd85d8d75554bb8972b1ab6c2563e32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.615 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.616 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/network.incoming.bytes volume: 1766 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.616 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/network.incoming.bytes volume: 1478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.617 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.618 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/network.incoming.bytes volume: 1850 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69ea4d3-9302-4114-9a10-6f7acb36abb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1766, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-0000003a-b81bb939-d14f-4a72-b7fe-95fc5d8810a1-tap5683f8a8-69', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'tap5683f8a8-69', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:3c:38', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5683f8a8-69'}, 'message_id': 'da6961c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.391893861, 'message_signature': 'dca3df8f7acf0528a755c2809133749089a3bb056a4a45f337eadcd55065531c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 
'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': 'instance-0000003e-5d70ac2d-111f-4e1b-ac26-3e02849b0458-tapac02dd63-5a', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'tapac02dd63-5a', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6a:c5:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapac02dd63-5a'}, 'message_id': 'da696a1a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.397804232, 'message_signature': '6b2e75fc8bd7bba2f5e38c646dfa751fb0fd7935acbd0b2609cd902d3e055701'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1478, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-00000037-472c7e2c-bdad-4230-904b-6937ceb872d2-tap9357c6a6-eb', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'tap9357c6a6-eb', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 
'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:08:e8:08', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9357c6a6-eb'}, 'message_id': 'da698df6-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.407313068, 'message_signature': 'f4b3d87ed6964fb2fadbb4501c0de9aead8fb8d93f4a87478459015d27179e7c'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000003c-25918fc4-05ec-4a16-b77f-ca1d352a2763-tap2064bfa7-12', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'tap2064bfa7-12', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:68:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2064bfa7-12'}, 'message_id': 'da6995ee-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.411047816, 'message_signature': 'c040a305fb600038f12981f6d87b3cae12b01484e2c4fe1f4c7d131a27d8c969'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': 'instance-0000003b-297d72ef-6b79-45b3-813b-52b5144b522e-tap821e6243-8d', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'tap821e6243-8d', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a6:47:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap821e6243-8d'}, 'message_id': 'da699d82-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.415441934, 'message_signature': '1901ca4c0c610ba987c381afb4dd8e8740e3631570fb22e6c74b3f89368f2792'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'instance-0000003d-bcdca3f9-3e24-4209-808c-8093b55e5c2d-tap8
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: empest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'tap88c7b630-e8', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bb:19:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88c7b630-e8'}, 'message_id': 'da69a50c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.419890753, 'message_signature': 'e15920ed7dd9a03fe0da8658b67f4e8606e93696e6b252da7cc36d2d0d43c084'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'instance-0000003f-e9f9bf08-7688-4213-91ff-74f2271ec71d-tap48b30c48-78', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'tap48b30c48-78', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:62:bb:58', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap48b30c48-78'}, 'message_id': 'da69b024-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.424289471, 'message_signature': '9d443e7e4e0ffe061b1d90e84bdea6d3a4213405f09f78201649395ea45649b6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': 'da69b812-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.428325918, 
'message_signature': 'e266145f296d8f7345737189a6ed32f1c4b8c98d9560a7840b74d60405067109'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1850, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'instance-00000039-8888dd78-1c78-4065-8536-9a1096bdf57b-tapc5cb68aa-e5', 'timestamp': '2025-12-05T12:06:38.616033', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'tapc5cb68aa-e5', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:a8:16', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5cb68aa-e5'}, 'message_id': 'da69bf9c-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.43496405, 'message_signature': '333baf47bab16109880f582d6670e5c9dbc037d4e3801a61d3e0ebaec7117b88'}]}, 'timestamp': '2025-12-05 12:06:38.618709', '_unique_id': 'a0fe6515efdb43da9723453aab0bbef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.608 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: : 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da67ce [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.latency volume: 296278000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.latency volume: 22596550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.620 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.latency volume: 268442554 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.latency volume: 39800296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.621 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.latency volume: 194632019 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.latency volume: 22825632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.latency volume: 252632619 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.latency volume: 43738699 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.622 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.latency volume: 305354112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.latency volume: 32449780 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.latency volume: 298073877 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.623 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.latency volume: 256339525 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.latency volume: 366072663 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.latency volume: 1761181 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 227447368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.624 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 33644734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.625 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.latency volume: 375710481 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.626 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.latency volume: 23531071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa21226f-43fe-4ce3-8925-7f73f4fc3c3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 296278000, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a03bc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '63130161c56e925cf4a53ea4a8332429d34ace337a062308502cb21d8ce1a131'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22596550, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a0e16-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '9ac32984f8f04d74480477022a455d85dd9e257453ebc2b9df8a44baeefe47c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 268442554, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a1762-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'a48e55fb57506c0b77e4c5a6ca663103507b344e4d0e52b0a795816a1399cc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39800296, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a2432-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '4854277bbe74fd530657e045703102c375c1f566f30bc2f7008e058aa162ba25'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194632019, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a4958-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '956034644a86cce7a8caa05c28403e57b644775f3177eef3651242555d680a23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22825632, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a548e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'f1320dc0b4ac67f806acab01047aa4a8564b012eeb5efbf3b0c8466316689db0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 252632619, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a5bdc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '82b92c30fcaf49c3c49bb2ce5d72ef21c8af0725ae1196f2188be4ecbef5df32'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43738699, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a6302-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '4d567a632c27128b8ad785a889b3afddf7aa8f8e5841696073b6f4d26812d82f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 305354112, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a6b22-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '92788ba187893435bb3e56b91f79b88c8630e1bca84b84317af0668bea756c1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32449780, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a78d8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': '34c5a542c06ac49c60e74b89e03836dc7f8d50bc1ad96d087b380e4e787af5c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 298073877, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a835a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '5ad5c80388fe8967d4b618b24c0c0cd1ca5b95d6fae31d3a46b93ebb8fe8f031'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 256339525, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_r
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: s_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6a8d32-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '91f5ee97039720bfb511409b99afdbf8c83b4b45222299099b9ed5459644bb0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 366072663, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6a97fa-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'd35bef26877e8adec6e4b20477ff01bc5a4c6fdc38672ad541cef06e6dc938f6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1761181, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 
'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6aa3d0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'ba3c7e2cbc1b0d9c87a9dfd962e2b6761de24a6a9a853a1e074b92b34f1219a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 227447368, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6aadda-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '83368324e0fa15e7b08fcef9f734da3e1de48e45e3d87bc6327ac760921a8719'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33644734, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6ad6d4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '0427784b4b03afec5690ce0965daf9d807c28d78d7c735c21f9a120f006ceb5c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 375710481, 'user_id': '4f8149b8192e411a9131b103b25862b6', 
'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6ae534-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '8beff11ef08d59174b26b6eccb11ef354dbc714f87f1f1dd3a323de539179c91'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23531071, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.620145', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-8
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: , 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6af10a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'bbbdbf079256f79790c8da023741086e43f4329741de4e4c187c870fd65c1223'}]}, 'timestamp': '2025-12-05 12:06:38.626580', '_unique_id': '6caec8cac496469f96ab22828d69d5a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.615 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: y_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'},  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da68eac2-d1d2-11f0-8572- [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.629 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.requests volume: 1035 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.requests volume: 1072 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.630 12 DEBUG ceilometer.compute.pollsters [-] 5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.631 12 DEBUG ceilometer.compute.pollsters [-] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000038, id=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.631 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.requests volume: 1073 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 472c7e2c-bdad-4230-904b-6937ceb872d2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.632 12 DEBUG ceilometer.compute.pollsters [-] 25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.requests volume: 1076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] 297d72ef-6b79-45b3-813b-52b5144b522e/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.633 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.requests volume: 926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk.device.read.requests volume: 73 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.634 12 DEBUG ceilometer.compute.pollsters [-] e9f9bf08-7688-4213-91ff-74f2271ec71d/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.635 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.635 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.637 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.637 12 DEBUG ceilometer.compute.pollsters [-] 8888dd78-1c78-4065-8536-9a1096bdf57b/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.619 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90c8ea52-c9c3-4021-91ff-bd09aa563bd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1035, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6b7a08-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '74de7eb892269e4f278bdc76e100314c6f729b709cb222c9fb48c969c17f603d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 
'project_name': None, 'resource_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1462907521', 'name': 'instance-0000003a', 'instance_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'instance_type': 'm1.micro', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '09233d41-3279-4f39-ac6e-a21662b4f176', 'name': 'm1.micro', 'vcpus': 1, 'ram': 192, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 192, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6b876e-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.795970576, 'message_signature': '47fcf20b081ec0c32c8088b00037d6f8f6af673bd90593a62deb9ad55b9e73e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1072, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6b9268-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': 'e7200723662aecf7a74c308c18e7de4a106c5581db9fb4388679eb3eabf0eaf5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': 'bc4332be3b424a5e996b61b244505cfc', 'user_name': None, 'project_id': '6d62df5807554f499d26b5fc77ec8603', 'project_name': None, 'resource_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachVolumeShelveTestJSON-server-795100487', 'name': 'instance-0000003e', 'instance_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'instance_type': 'm1.nano', 'host': 'e2508d951bc4d590047127d476550a13a6a4f88f76bdd07811ff8184', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6b9f92-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.82125414, 'message_signature': '50e2c98cde323cab826c3076e4e661609d9b7df94ea14fbe0819dfa842e555d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1073, 'user_id': 
'8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6bcd5a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': '01763b5d7deda5aa14d6321eca8ea20a4208429b7f2a8e86606b943437a809b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '472c7e2c-bdad-4230-904b-6937ceb872d2-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-292918791', 'name': 'instance-00000037', 'instance_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: _64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6bd944-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.854796903, 'message_signature': 'd8db34c1e2df5ab7b0cce71f20cc968aaa804fc409321426956acea0681c65d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6be420-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '52f026c35ab82136e3b6dfd875bc2c1f926c53efe7d4e2b6062b4f37c026cee9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1604830094', 'name': 'instance-0000003c', 'instance_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6bf1e0-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.880301793, 'message_signature': '217519ac7265777e0c0f343fd803ba73872b611942071bc26388e566deded3d5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1076, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6bfdd4-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'c758ae2b7f7bee709c9fc5401f1c5e6df8da4fbda3f09f460738620cb6a66b29'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8cf2534e7c394130b675e44ed567401b', 'user_name': None, 'project_id': '85037de7275442698e604ee3f6283cbc', 'project_name': None, 'resource_id': '297d72ef-6b79-45b3-813b-52b5144b522e-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-FloatingIPsAssociationTestJSON-server-2111676304', 'name': 'instance-0000003b', 'instance_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'instance_type': 'm1.nano', 'host': 'd59ce66c2f543614250614943cfc7236e5a739697580bb4cbe790cdd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c0888-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.906995288, 'message_signature': 'eb2a3e6cb02c39ad34092b6fd28dd87debd18c24ead8a80717cfa7ed82510a57'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 926, 'user_id': 
'6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c135a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '1ad4de852ae5273c4824edbe4daa346f1169118dd9cda6b19a048ee3b46133e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 73, 'user_id': '6b73160d333a43ed94d4258262e3c2b5', 'user_name': None, 'project_id': '5285f99befb24ac285be8e4fc1d18e69', 'project_name': None, 'resource_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesUnderV243Test-server-2105634627', 'name': 'instance-0000003d', 'instance_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'instance_type': 'm1.nano', 'host': '3b5087fa4e86c7f97aed11cf1a48a122a6f42c99cc10f97c187cb6b6', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'},
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 6_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c22c8-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.9377245, 'message_signature': '232c0e890fe5f59d14e8035ef2553674db4d3f4cf34404042b7250f9d37274fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c2e3a-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': '6515a70f62151837a4abff79a24d6de49c246e5bc8ceff0769601fa563f541b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_name': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 
'project_name': None, 'resource_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'name': 'instance-0000003f', 'instance_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'instance_type': 'm1.nano', 'host': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c3934-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3778.983328273, 'message_signature': 'ebb2aab4622e44c73a04ce3317f82f3813f006331204a2fec5bfd30f63b235ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6c48fc-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': '00ed4b679438c767cf162c86ac682104adb863ca9a705f1f5535e2acc635aab5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6c80ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.023095147, 'message_signature': 'b36d1244334471aec5bec8f0bad946dd67d9e5e994b56c7c980ca9ae0b495a93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': 
'4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-vda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e277715-617f-4e35-89c7-208beae9fd5c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da6ca3ec-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': '66c8171b811437072a356eb39f71664c45a91d8b277c5efddbaa916547acc973'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4f8149b8192e411a9131b103b25862b6', 'user_name': None, 'project_id': 'e8f613c8797e432d96e43223fb7c476d', 'project_name': None, 'resource_id': '8888dd78-1c78-4065-8536-9a1096bdf57b-sda', 'timestamp': '2025-12-05T12:06:38.629711', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-2001854085', 'name': 'instance-00000039', 'instance_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'instance_type': 'm1.nano', 'host': 'f999410875d069cb1c0f2431557bd52ce132e9c7ccf21e0289236042', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6e277715-617f-4e35-89c7-208beae9fd5c'}, 'image_ref': '6e27771
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: ', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da6cad24-d1d2-11f0-8572-fa163e006c52', 'monotonic_time': 3779.052347446, 'message_signature': 'eb2bf9c380188a0039e26faa4320698bf8d1fb2ca669f5230e7ff7ae749781eb'}]}, 'timestamp': '2025-12-05 12:06:38.637937', '_unique_id': '8e278e94251542f89677423aa1959fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:06:38 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:06:38.641 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1462907521>, <NovaLikeServer: tempest-AttachVolumeShelveTestJSON-server-795100487>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1365452817>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-292918791>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1604830094>, <NovaLikeServer: tempest-FloatingIPsAssociationTestJSON-server-2111676304>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-2105634627>, <NovaLikeServer: tempest-SecurityGroupsTestJSON-server-1685847021>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1629320086>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-2001854085>]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.627 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: s_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, ' [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:06:38.639 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: _64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_g [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 6_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_ [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.867 187212 INFO nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Creating config drive at /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.875 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ddna3ky execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.966 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.967 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.988 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.989 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.989 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:38 np0005546909 nova_compute[187208]: 2025-12-05 12:06:38.990 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.010 187212 DEBUG oslo_concurrency.processutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ddna3ky" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:39 np0005546909 kernel: tapd10caa85-df: entered promiscuous mode
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.0714] manager: (tapd10caa85-df): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00502|binding|INFO|Claiming lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for this chassis.
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00503|binding|INFO|d10caa85-dfcd-49ce-8ff7-2c2a68d1d731: Claiming fa:16:3e:cc:8d:e9 10.100.0.10
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00504|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 ovn-installed in OVS
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00505|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 up in Southbound
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.145 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:8d:e9 10.100.0.10'], port_security=['fa:16:3e:cc:8d:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed00d159-9d70-481e-93be-ea180fea04ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59233d66-44e6-47b3-b612-4f7d677af03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cb353a76-4787-4857-933e-e95743324e9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f37497c0-7b03-4b0b-94d8-7ed5a2c705cb, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.147 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 in datapath 59233d66-44e6-47b3-b612-4f7d677af03d bound to our chassis#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.152 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 59233d66-44e6-47b3-b612-4f7d677af03d#033[00m
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bb:19:b7 10.100.0.7
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:19:b7 10.100.0.7
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.166 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb9f4c3-ce3f-4ef5-8929-02fd699ed5e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.167 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap59233d66-41 in ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:06:39 np0005546909 systemd-machined[153543]: New machine qemu-68-instance-00000040.
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.170 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap59233d66-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6695cd-bcd7-4188-936d-e1c062efb5c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dca72c4a-15b2-45d0-97da-b10f05f24ec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 systemd[1]: Started Virtual Machine qemu-68-instance-00000040.
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.186 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[115314f5-617d-4695-92b7-b735289b384d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 systemd-udevd[227041]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.2027] device (tapd10caa85-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.2039] device (tapd10caa85-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.205 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[790cc2fd-28ed-46f9-956a-9e8d36b6cbb3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.241 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[23ddb75d-881a-4ccb-b995-e17c55b1c9cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 systemd-udevd[227044]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.2486] manager: (tap59233d66-40): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.247 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9dfdb30-515b-489b-98b0-218c77c8acc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.286 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3043ad1f-a879-4a42-b5f5-54f1bf4ed3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.290 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4020b9f1-e42c-498e-b7f0-f6d566030ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.3125] device (tap59233d66-40): carrier: link connected
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.319 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff067d2-a0af-42e1-b4d4-176e3e535734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8325463-7746-4690-b125-70012715f990]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59233d66-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:70:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377987, 'reachable_time': 23411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227075, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f1289b-563f-4c06-8c85-ec4437542bdc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4e:7074'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377987, 'tstamp': 377987}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227076, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.378 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[931b7a72-94eb-4e6f-9e0f-55e621da2341]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59233d66-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4e:70:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377987, 'reachable_time': 23411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227077, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.394 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updated VIF entry in instance network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.395 187212 DEBUG nova.network.neutron [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.408 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54c5eb9b-a04c-4938-a139-fcb908714311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.424 187212 DEBUG oslo_concurrency.lockutils [req-fbda2095-1377-4b6d-b242-7ebdf3d786a5 req-f7e8eeae-2ccc-4534-94f9-df1ef44afd2c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.470 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[47968083-6110-4d0b-b040-3f46a55bce53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59233d66-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.471 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59233d66-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 NetworkManager[55691]: <info>  [1764936399.4748] manager: (tap59233d66-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Dec  5 07:06:39 np0005546909 kernel: tap59233d66-40: entered promiscuous mode
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.480 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap59233d66-40, col_values=(('external_ids', {'iface-id': '229f26d0-355d-483b-86df-f1f319e2601e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.482 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:39Z|00506|binding|INFO|Releasing lport 229f26d0-355d-483b-86df-f1f319e2601e from this chassis (sb_readonly=0)
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.498 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.500 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[24c22ac3-7515-44cc-a902-5c617503f4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.501 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-59233d66-44e6-47b3-b612-4f7d677af03d
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/59233d66-44e6-47b3-b612-4f7d677af03d.pid.haproxy
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 59233d66-44e6-47b3-b612-4f7d677af03d
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:06:39 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:39.504 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'env', 'PROCESS_TAG=haproxy-59233d66-44e6-47b3-b612-4f7d677af03d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/59233d66-44e6-47b3-b612-4f7d677af03d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.595 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.698 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.699 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.706 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936399.7061496, ed00d159-9d70-481e-93be-ea180fea04ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.707 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.772 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936399.7107768, ed00d159-9d70-481e-93be-ea180fea04ba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.773 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.816 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.819 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:39 np0005546909 nova_compute[187208]: 2025-12-05 12:06:39.861 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:39 np0005546909 podman[227121]: 2025-12-05 12:06:39.934534457 +0000 UTC m=+0.060385203 container create 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:39 np0005546909 systemd[1]: Started libpod-conmon-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope.
Dec  5 07:06:39 np0005546909 podman[227121]: 2025-12-05 12:06:39.905414592 +0000 UTC m=+0.031265368 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:06:40 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:06:40 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0d77ec8b1cc024940e91355c3fecada2e5d7bf69ad2fc36a67c677303a54e3e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:06:40 np0005546909 podman[227121]: 2025-12-05 12:06:40.02733033 +0000 UTC m=+0.153181106 container init 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:06:40 np0005546909 podman[227121]: 2025-12-05 12:06:40.036460565 +0000 UTC m=+0.162311311 container start 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:06:40 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : New worker (227143) forked
Dec  5 07:06:40 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : Loading success.
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.264 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.266 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.267 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.268 187212 WARNING nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state powering-on.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.268 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.269 187212 DEBUG oslo_concurrency.lockutils [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.270 187212 DEBUG nova.compute.manager [req-d67fcb8a-a42e-49f2-bac9-0ec424fcfb9b req-bacbbde3-de5d-4302-aca6-8ceb0828b801 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Processing event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.271 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.280 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.282 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936400.2779553, ed00d159-9d70-481e-93be-ea180fea04ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.282 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.289 187212 INFO nova.virt.libvirt.driver [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance spawned successfully.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.290 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.320 187212 DEBUG nova.network.neutron [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.359 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Releasing lock "refresh_cache-cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.370 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.371 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.372 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.372 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.373 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.374 187212 DEBUG nova.virt.libvirt.driver [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.378 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.393 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.395 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'numa_topology' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.437 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.439 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.467 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.468 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.469 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.469 187212 DEBUG os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.473 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap549318e9-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.477 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.480 187212 INFO os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.490 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start _get_guest_xml network_info=[{"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.496 187212 WARNING nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.502 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.504 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.511 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.512 187212 DEBUG nova.virt.libvirt.host [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.513 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.513 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.514 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.515 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.515 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.516 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.516 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.517 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.518 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.518 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.519 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.520 187212 DEBUG nova.virt.hardware [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.520 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'vcpu_model' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.525 187212 INFO nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 6.09 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.526 187212 DEBUG nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.543 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.607 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.608 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.609 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.610 187212 DEBUG oslo_concurrency.lockutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.611 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.611 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.613 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.614 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'pci_devices' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.683 187212 INFO nova.compute.manager [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 6.94 seconds to build instance.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.690 187212 DEBUG nova.virt.libvirt.driver [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <uuid>cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</uuid>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <name>instance-00000038</name>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1365452817</nova:name>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:06:40</nova:creationTime>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:user uuid="4f8149b8192e411a9131b103b25862b6">tempest-ListServerFiltersTestJSON-711798252-project-member</nova:user>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:project uuid="e8f613c8797e432d96e43223fb7c476d">tempest-ListServerFiltersTestJSON-711798252</nova:project>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        <nova:port uuid="549318e9-e629-4e2c-8cbb-3cd263c2bc34">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="serial">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="uuid">cbcd4733-8c53-4696-9bc0-6e5c516c9dcf</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk.config"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:9b:d7:ed"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <target dev="tap549318e9-e6"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/console.log" append="off"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <input type="keyboard" bus="usb"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:06:40 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:06:40 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:06:40 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:06:40 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.696 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.725 187212 DEBUG oslo_concurrency.lockutils [None req-3597cd9c-a20b-4756-a94b-db99a4f000ac 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.768 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.769 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.834 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.837 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'trusted_certs' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.853 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.913 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.914 187212 DEBUG nova.network.neutron [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.924 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.925 187212 DEBUG nova.virt.disk.api [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Checking if we can resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.925 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.950 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.951 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.952 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.952 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.953 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.954 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.955 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.956 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.957 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.958 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.959 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 DEBUG oslo_concurrency.lockutils [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 DEBUG nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.960 187212 WARNING nova.compute.manager [req-9c2ff217-33ae-4e8d-99c5-84af8d256d40 req-9daea93d-cc75-4ff9-8873-2315abc6ed6f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.997 187212 DEBUG oslo_concurrency.processutils [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.998 187212 DEBUG nova.virt.disk.api [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Cannot resize image /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:06:40 np0005546909 nova_compute[187208]: 2025-12-05 12:06:40.999 187212 DEBUG nova.objects.instance [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'migration_context' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.012 187212 DEBUG nova.virt.libvirt.vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.012 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.014 187212 DEBUG nova.network.os_vif_util [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.014 187212 DEBUG os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.016 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.020 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap549318e9-e6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.020 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap549318e9-e6, col_values=(('external_ids', {'iface-id': '549318e9-e629-4e2c-8cbb-3cd263c2bc34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:d7:ed', 'vm-uuid': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 NetworkManager[55691]: <info>  [1764936401.0350] manager: (tap549318e9-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.044 187212 DEBUG nova.compute.manager [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.044 187212 DEBUG nova.compute.manager [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.045 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.046 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.048 187212 INFO os_vif [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:41 np0005546909 NetworkManager[55691]: <info>  [1764936401.1301] manager: (tap549318e9-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Dec  5 07:06:41 np0005546909 kernel: tap549318e9-e6: entered promiscuous mode
Dec  5 07:06:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:41Z|00507|binding|INFO|Claiming lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 for this chassis.
Dec  5 07:06:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:41Z|00508|binding|INFO|549318e9-e629-4e2c-8cbb-3cd263c2bc34: Claiming fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:06:41 np0005546909 systemd-udevd[227067]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.145 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.146 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 bound to our chassis#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.149 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:06:41 np0005546909 NetworkManager[55691]: <info>  [1764936401.1518] device (tap549318e9-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:41 np0005546909 NetworkManager[55691]: <info>  [1764936401.1529] device (tap549318e9-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:41Z|00509|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 ovn-installed in OVS
Dec  5 07:06:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:41Z|00510|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 up in Southbound
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af2d3eb1-9c85-4e8a-918f-0c13fb0ce2fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 systemd-machined[153543]: New machine qemu-69-instance-00000038.
Dec  5 07:06:41 np0005546909 systemd[1]: Started Virtual Machine qemu-69-instance-00000038.
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.210 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ab650c-c533-400f-b01b-21a7c59ff973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.217 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[54b0af44-f8b3-4e3d-82a9-77a83b506309]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.256 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[798bd06c-103e-4840-8ea2-9e7d307faa17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.309 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5a05ab-b243-4c8d-b91b-d52f00f3e9c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227195, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[03ea06a1-8006-4cf0-915f-144e4fa00539]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227197, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227197, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.330 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.332 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.335 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.336 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:41.336 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.643 187212 DEBUG nova.compute.manager [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.644 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for cbcd4733-8c53-4696-9bc0-6e5c516c9dcf due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.645 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936401.642585, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.646 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.650 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance rebooted successfully.#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.651 187212 DEBUG nova.compute.manager [None req-0ef7427b-456c-49d6-ada4-700fd9c0b61f 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.712 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.718 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.756 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936401.6429155, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.757 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Started (Lifecycle Event)#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.816 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:41 np0005546909 nova_compute[187208]: 2025-12-05 12:06:41.822 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.068 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.237 187212 INFO nova.virt.libvirt.driver [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.238 187212 INFO nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 6.10 seconds to snapshot the instance on the hypervisor.#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.502 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Found 3 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.503 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Rotating out 1 backups _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4458#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.504 187212 DEBUG nova.compute.manager [None req-e8529136-417d-46c9-82e8-daa05b0da777 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting image aa21033c-b586-4741-8de3-906338ad12ee _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4463#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.625 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.626 187212 DEBUG nova.network.neutron [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.652 187212 DEBUG oslo_concurrency.lockutils [req-9da310ee-4024-4869-bc71-f612d2b6c99d req-66288f10-22e8-45ae-aebb-950834e677e0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:42 np0005546909 nova_compute[187208]: 2025-12-05 12:06:42.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.082 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.083 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.084 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.164 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.166 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.166 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.167 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.167 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-changed-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.168 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing instance network info cache due to event network-changed-48b30c48-7858-408b-aeab-df46f6277546. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.169 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.169 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.170 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Refreshing network info cache for port 48b30c48-7858-408b-aeab-df46f6277546 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:43 np0005546909 podman[227206]: 2025-12-05 12:06:43.24224075 +0000 UTC m=+0.093289918 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.286 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.358 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.359 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.423 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.431 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.516 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.518 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.597 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.605 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.683 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.708 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.709 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.710 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.712 187212 INFO nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Terminating instance#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.713 187212 DEBUG nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:06:43 np0005546909 kernel: tap821e6243-8d (unregistering): left promiscuous mode
Dec  5 07:06:43 np0005546909 NetworkManager[55691]: <info>  [1764936403.7407] device (tap821e6243-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:06:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:43Z|00511|binding|INFO|Releasing lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed from this chassis (sb_readonly=0)
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:43Z|00512|binding|INFO|Setting lport 821e6243-8d28-4c8c-874c-f1e69c7d3bed down in Southbound
Dec  5 07:06:43 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:43Z|00513|binding|INFO|Removing iface tap821e6243-8d ovn-installed in OVS
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.763 187212 DEBUG nova.compute.manager [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG nova.compute.manager [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing instance network info cache due to event network-changed-821e6243-8d28-4c8c-874c-f1e69c7d3bed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.764 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.765 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Refreshing network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.786 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.791 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Dec  5 07:06:43 np0005546909 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000003b.scope: Consumed 13.696s CPU time.
Dec  5 07:06:43 np0005546909 systemd-machined[153543]: Machine qemu-63-instance-0000003b terminated.
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.886 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:47:26 10.100.0.9'], port_security=['fa:16:3e:a6:47:26 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '297d72ef-6b79-45b3-813b-52b5144b522e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=821e6243-8d28-4c8c-874c-f1e69c7d3bed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.887 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 821e6243-8d28-4c8c-874c-f1e69c7d3bed in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d unbound from our chassis#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.890 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f4c4888-4b32-4259-8441-31af091e0c7d#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.892 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.897 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.905 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[abcaf5c0-e078-43e1-9d40-177753fdb042]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.951 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e56e4c08-8222-4e88-8128-3264533fcd19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.956 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b087bb03-78d8-41d8-806d-86acaaf9e04c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.984 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:43 np0005546909 nova_compute[187208]: 2025-12-05 12:06:43.994 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:43 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:43.997 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3ed0ee-22fd-4552-b756-cc3e1f956995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.018 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c80c2de7-dbfa-4db9-8181-4a7c6a3a75f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f4c4888-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:45:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 126], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372337, 'reachable_time': 25070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227283, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.019 187212 INFO nova.virt.libvirt.driver [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance destroyed successfully.#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.020 187212 DEBUG nova.objects.instance [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'resources' on Instance uuid 297d72ef-6b79-45b3-813b-52b5144b522e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7495b3-c7aa-4de8-a742-2e91dd005233]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372348, 'tstamp': 372348}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227285, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0f4c4888-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372350, 'tstamp': 372350}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227285, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.037 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.038 187212 DEBUG nova.virt.libvirt.vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-2111676304',display_name='tempest-FloatingIPsAssociationTestJSON-server-2111676304',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-2111676304',id=59,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-3sf4jdpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:20Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=297d72ef-6b79-45b3-813b-52b5144b522e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.040 187212 DEBUG nova.network.os_vif_util [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.041 187212 DEBUG nova.network.os_vif_util [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.042 187212 DEBUG os_vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.044 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f4c4888-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f4c4888-40, col_values=(('external_ids', {'iface-id': 'b2e28c8a-557d-459b-807e-dd1f5be0a608'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.045 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap821e6243-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.045 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.051 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.054 187212 INFO os_vif [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:47:26,bridge_name='br-int',has_traffic_filtering=True,id=821e6243-8d28-4c8c-874c-f1e69c7d3bed,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap821e6243-8d')#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.054 187212 INFO nova.virt.libvirt.driver [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deleting instance files /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e_del#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.055 187212 INFO nova.virt.libvirt.driver [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deletion of /var/lib/nova/instances/297d72ef-6b79-45b3-813b-52b5144b522e_del complete#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.065 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.066 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.128 187212 INFO nova.compute.manager [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.129 187212 DEBUG oslo.service.loopingcall [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.130 187212 DEBUG nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.130 187212 DEBUG nova.network.neutron [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.134 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.137 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Error from libvirt while getting description of instance-0000003b: [Error Code 42] Domain not found: no domain with matching uuid '297d72ef-6b79-45b3-813b-52b5144b522e' (instance-0000003b): libvirt.libvirtError: Domain not found: no domain with matching uuid '297d72ef-6b79-45b3-813b-52b5144b522e' (instance-0000003b)#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.142 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.169 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:44.170 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.217 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.218 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.289 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.298 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.369 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.370 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.450 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.458 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.534 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.537 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.600 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.609 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.672 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.674 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.737 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.744 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.805 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.808 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:06:44 np0005546909 nova_compute[187208]: 2025-12-05 12:06:44.871 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.172 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.174 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=3978MB free_disk=72.98876953125GB free_vcpus=-3 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.174 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.175 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.272 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.273 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 472c7e2c-bdad-4230-904b-6937ceb872d2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.273 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 8888dd78-1c78-4065-8536-9a1096bdf57b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 297d72ef-6b79-45b3-813b-52b5144b522e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.274 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance bcdca3f9-3e24-4209-808c-8093b55e5c2d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.275 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.276 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance ed00d159-9d70-481e-93be-ea180fea04ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.278 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 11 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.278 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1984MB phys_disk=79GB used_disk=11GB total_vcpus=8 used_vcpus=11 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.388 187212 DEBUG nova.network.neutron [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.417 187212 INFO nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Took 1.29 seconds to deallocate network for instance.#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.469 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.490 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updated VIF entry in instance network info cache for port 48b30c48-7858-408b-aeab-df46f6277546. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.491 187212 DEBUG nova.network.neutron [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [{"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.510 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-e9f9bf08-7688-4213-91ff-74f2271ec71d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.511 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.511 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.512 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.512 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.513 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.514 187212 DEBUG oslo_concurrency.lockutils [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.515 187212 DEBUG nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.515 187212 WARNING nova.compute.manager [req-abbbe80a-3994-4e75-b36f-a128e9bf1ceb req-50ce43c8-79bf-4b72-8a2a-b9e6efa76189 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.582 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.991 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updated VIF entry in instance network info cache for port 821e6243-8d28-4c8c-874c-f1e69c7d3bed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:45 np0005546909 nova_compute[187208]: 2025-12-05 12:06:45.992 187212 DEBUG nova.network.neutron [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Updating instance_info_cache with network_info: [{"id": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "address": "fa:16:3e:a6:47:26", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap821e6243-8d", "ovs_interfaceid": "821e6243-8d28-4c8c-874c-f1e69c7d3bed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.075 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.081 187212 DEBUG oslo_concurrency.lockutils [req-d3e5bb84-328d-425c-937a-e710634463f7 req-2b25cdee-3ed9-469b-ba3b-ab18f512cbec 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-297d72ef-6b79-45b3-813b-52b5144b522e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.104 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.105 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.106 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:46.173 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.294 187212 DEBUG nova.compute.provider_tree [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.381 187212 DEBUG nova.scheduler.client.report [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.411 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.521 187212 INFO nova.scheduler.client.report [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Deleted allocations for instance 297d72ef-6b79-45b3-813b-52b5144b522e#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.614 187212 DEBUG oslo_concurrency.lockutils [None req-db8bfb4b-1786-4635-a5ad-880f5c3d577c 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.691 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.692 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:46 np0005546909 nova_compute[187208]: 2025-12-05 12:06:46.692 187212 DEBUG nova.objects.instance [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.101 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.135 187212 DEBUG nova.objects.instance [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.152 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.337 187212 DEBUG nova.policy [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:47Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:bb:58 10.100.0.8
Dec  5 07:06:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:47Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:bb:58 10.100.0.8
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.896 187212 DEBUG nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.897 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG oslo_concurrency.lockutils [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.898 187212 DEBUG nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:47 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.899 187212 WARNING nova.compute.manager [req-0ffb74e3-ad1a-4064-8d0e-d8cf6de2e1a9 req-326e5c53-c1b3-4eea-8ea7-8b2a4a3ab2e8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-unplugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:06:48 np0005546909 nova_compute[187208]: 2025-12-05 12:06:47.999 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully created port: 8749491f-af83-499c-b823-14496cf1872d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:06:48 np0005546909 nova_compute[187208]: 2025-12-05 12:06:48.905 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Successfully updated port: 8749491f-af83-499c-b823-14496cf1872d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:06:48 np0005546909 nova_compute[187208]: 2025-12-05 12:06:48.931 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:48 np0005546909 nova_compute[187208]: 2025-12-05 12:06:48.932 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:48 np0005546909 nova_compute[187208]: 2025-12-05 12:06:48.932 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:06:49 np0005546909 nova_compute[187208]: 2025-12-05 12:06:49.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:49 np0005546909 nova_compute[187208]: 2025-12-05 12:06:49.101 187212 WARNING nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:06:50 np0005546909 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 DEBUG nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-deleted-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:50 np0005546909 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 INFO nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Neutron deleted interface 821e6243-8d28-4c8c-874c-f1e69c7d3bed; detaching it from the instance and deleting it from the info cache#033[00m
Dec  5 07:06:50 np0005546909 nova_compute[187208]: 2025-12-05 12:06:50.197 187212 DEBUG nova.network.neutron [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Dec  5 07:06:50 np0005546909 nova_compute[187208]: 2025-12-05 12:06:50.200 187212 DEBUG nova.compute.manager [req-7b28ae3d-2feb-4d2b-8b25-52d37732b78e req-3e2b296e-1e46-47c5-9810-292404045315 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Detach interface failed, port_id=821e6243-8d28-4c8c-874c-f1e69c7d3bed, reason: Instance 297d72ef-6b79-45b3-813b-52b5144b522e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec  5 07:06:50 np0005546909 podman[227340]: 2025-12-05 12:06:50.227262053 +0000 UTC m=+0.075407309 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.243 187212 DEBUG nova.network.neutron [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.472 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.475 187212 DEBUG nova.virt.libvirt.vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.476 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.477 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.477 187212 DEBUG os_vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.478 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.479 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.481 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8749491f-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.482 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8749491f-af, col_values=(('external_ids', {'iface-id': '8749491f-af83-499c-b823-14496cf1872d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:50:d2', 'vm-uuid': '25918fc4-05ec-4a16-b77f-ca1d352a2763'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.484 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 NetworkManager[55691]: <info>  [1764936412.4852] manager: (tap8749491f-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.489 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.491 187212 INFO os_vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af')#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.491 187212 DEBUG nova.virt.libvirt.vif [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.492 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.492 187212 DEBUG nova.network.os_vif_util [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.495 187212 DEBUG nova.virt.libvirt.guest [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:83:50:d2"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <target dev="tap8749491f-af"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:06:52 np0005546909 nova_compute[187208]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  5 07:06:52 np0005546909 kernel: tap8749491f-af: entered promiscuous mode
Dec  5 07:06:52 np0005546909 NetworkManager[55691]: <info>  [1764936412.5137] manager: (tap8749491f-af): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Dec  5 07:06:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:52Z|00514|binding|INFO|Claiming lport 8749491f-af83-499c-b823-14496cf1872d for this chassis.
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:52Z|00515|binding|INFO|8749491f-af83-499c-b823-14496cf1872d: Claiming fa:16:3e:83:50:d2 10.100.0.14
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:52Z|00516|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d ovn-installed in OVS
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.534 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 systemd-udevd[227379]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.546 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:50:d2 10.100.0.14'], port_security=['fa:16:3e:83:50:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8749491f-af83-499c-b823-14496cf1872d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:52Z|00517|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d up in Southbound
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.548 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8749491f-af83-499c-b823-14496cf1872d in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.551 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:06:52 np0005546909 NetworkManager[55691]: <info>  [1764936412.5572] device (tap8749491f-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:06:52 np0005546909 NetworkManager[55691]: <info>  [1764936412.5592] device (tap8749491f-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.568 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2230c1-f93b-4ab4-a5ab-740e3a4741ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.604 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6f6b7613-375b-485b-97c7-116b255543e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.608 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[689c56a0-620d-43b4-9d93-a5bceabe7e4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.614 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:7b:68:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.615 187212 DEBUG nova.virt.libvirt.driver [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:83:50:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.637 187212 DEBUG nova.virt.libvirt.guest [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec  5 07:06:52 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec  5 07:06:52 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:06:52 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:06:52 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:06:52 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.646 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f88d806e-6dd9-462a-85cc-d3d185f1de23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55493d3d-31af-400a-97e0-f7af2146d062]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227388, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.687 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3d00ca-837c-4a95-be62-86b59a27ec41]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375496, 'tstamp': 375496}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227392, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375499, 'tstamp': 375499}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227392, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.689 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 nova_compute[187208]: 2025-12-05 12:06:52.692 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.695 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:52.696 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.111 187212 DEBUG oslo_concurrency.lockutils [None req-97a63438-de8f-4cdc-b434-97840d93bd14 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.419s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.281 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.281 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "297d72ef-6b79-45b3-813b-52b5144b522e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] No waiting events found dispatching network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.282 187212 WARNING nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Received unexpected event network-vif-plugged-821e6243-8d28-4c8c-874c-f1e69c7d3bed for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.compute.manager [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing instance network info cache due to event network-changed-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:53 np0005546909 nova_compute[187208]: 2025-12-05 12:06:53.283 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Refreshing network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:54Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:50:d2 10.100.0.14
Dec  5 07:06:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:54Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:50:d2 10.100.0.14
Dec  5 07:06:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:54Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:8d:e9 10.100.0.10
Dec  5 07:06:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:54Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:8d:e9 10.100.0.10
Dec  5 07:06:54 np0005546909 nova_compute[187208]: 2025-12-05 12:06:54.529 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:54 np0005546909 nova_compute[187208]: 2025-12-05 12:06:54.530 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:54 np0005546909 nova_compute[187208]: 2025-12-05 12:06:54.530 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:54 np0005546909 nova_compute[187208]: 2025-12-05 12:06:54.531 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:54 np0005546909 nova_compute[187208]: 2025-12-05 12:06:54.531 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:55Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:d7:ed 10.100.0.9
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.719 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updated VIF entry in instance network info cache for port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.719 187212 DEBUG nova.network.neutron [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [{"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.748 187212 DEBUG oslo_concurrency.lockutils [req-84c18e9c-9b90-4229-b55a-dbf2882de14d req-5b0c047b-691b-4a11-b927-cf806953e056 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ed00d159-9d70-481e-93be-ea180fea04ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.861 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.862 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.862 187212 INFO nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shelving#033[00m
Dec  5 07:06:56 np0005546909 nova_compute[187208]: 2025-12-05 12:06:56.890 187212 DEBUG nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:06:57 np0005546909 podman[227406]: 2025-12-05 12:06:57.215978914 +0000 UTC m=+0.061154196 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:06:57 np0005546909 podman[227405]: 2025-12-05 12:06:57.225457719 +0000 UTC m=+0.070820336 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.460 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.461 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-changed-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.485 187212 DEBUG nova.compute.manager [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing instance network info cache due to event network-changed-8749491f-af83-499c-b823-14496cf1872d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.486 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Refreshing network info cache for port 8749491f-af83-499c-b823-14496cf1872d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:57 np0005546909 nova_compute[187208]: 2025-12-05 12:06:57.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.592 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.593 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 WARNING nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.594 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 DEBUG oslo_concurrency.lockutils [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 DEBUG nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.595 187212 WARNING nova.compute.manager [req-636aea67-f82a-4d9c-86dd-012b04087dcd req-330bb041-51b3-46c9-a065-d83c90e40d71 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.696 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updated VIF entry in instance network info cache for port 8749491f-af83-499c-b823-14496cf1872d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.697 187212 DEBUG nova.network.neutron [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.721 187212 DEBUG oslo_concurrency.lockutils [req-153b2403-8cc1-4d9e-8229-98179e8e1ce5 req-0d7bd116-990c-442a-a496-204d358d01b0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.990 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936403.9883554, 297d72ef-6b79-45b3-813b-52b5144b522e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:06:58 np0005546909 nova_compute[187208]: 2025-12-05 12:06:58.990 187212 INFO nova.compute.manager [-] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.017 187212 DEBUG nova.compute.manager [None req-fb1722fa-3dbb-49f5-b3b2-9433e42b8c95 - - - - - -] [instance: 297d72ef-6b79-45b3-813b-52b5144b522e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:06:59 np0005546909 kernel: tapac02dd63-5a (unregistering): left promiscuous mode
Dec  5 07:06:59 np0005546909 NetworkManager[55691]: <info>  [1764936419.0848] device (tapac02dd63-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:59 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:59Z|00518|binding|INFO|Releasing lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b from this chassis (sb_readonly=0)
Dec  5 07:06:59 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:59Z|00519|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b down in Southbound
Dec  5 07:06:59 np0005546909 ovn_controller[95610]: 2025-12-05T12:06:59Z|00520|binding|INFO|Removing iface tapac02dd63-5a ovn-installed in OVS
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.102 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.104 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 unbound from our chassis#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.106 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc6ce614-d0f7-413f-bc3e-26f7271993d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.109 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e91fa82a-1db2-4f34-99c8-685ff480d1e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.109 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace which is not needed anymore#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:59 np0005546909 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Dec  5 07:06:59 np0005546909 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000003e.scope: Consumed 15.067s CPU time.
Dec  5 07:06:59 np0005546909 systemd-machined[153543]: Machine qemu-65-instance-0000003e terminated.
Dec  5 07:06:59 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : haproxy version is 2.8.14-c23fe91
Dec  5 07:06:59 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [NOTICE]   (226438) : path to executable is /usr/sbin/haproxy
Dec  5 07:06:59 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [WARNING]  (226438) : Exiting Master process...
Dec  5 07:06:59 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [ALERT]    (226438) : Current worker (226440) exited with code 143 (Terminated)
Dec  5 07:06:59 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[226434]: [WARNING]  (226438) : All workers exited. Exiting... (0)
Dec  5 07:06:59 np0005546909 systemd[1]: libpod-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope: Deactivated successfully.
Dec  5 07:06:59 np0005546909 podman[227469]: 2025-12-05 12:06:59.248922095 +0000 UTC m=+0.042402241 container died 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:06:59 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c-userdata-shm.mount: Deactivated successfully.
Dec  5 07:06:59 np0005546909 systemd[1]: var-lib-containers-storage-overlay-a1f29ee6b8791a73cdc634bfdec56c186b809e5d9ad3d5a603996e4c487e56a0-merged.mount: Deactivated successfully.
Dec  5 07:06:59 np0005546909 podman[227469]: 2025-12-05 12:06:59.290909534 +0000 UTC m=+0.084389680 container cleanup 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:06:59 np0005546909 systemd[1]: libpod-conmon-161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c.scope: Deactivated successfully.
Dec  5 07:06:59 np0005546909 podman[227499]: 2025-12-05 12:06:59.360234975 +0000 UTC m=+0.045743408 container remove 161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.367 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[10d69b94-5ebb-4427-b1b2-19bc26d9e5f0]: (4, ('Fri Dec  5 12:06:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c)\n161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c\nFri Dec  5 12:06:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c)\n161daad73771a3b408f3b222704f8ec138ef5f90c0ab58ba8cfc04a56628d91c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.369 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9e4e73-ac02-40fd-9fbf-da42a8e986a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.370 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.372 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:59 np0005546909 kernel: tapfc6ce614-d0: left promiscuous mode
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.387 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.391 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce581071-182b-4a00-bf88-0783172f027c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.407 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0345c101-f181-446e-8ad4-19c826dfecc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.409 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91e883af-6979-4d4b-ad06-247f05f2240c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.426 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2789a0da-6c83-4c17-b254-0aa8c4441732]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375872, 'reachable_time': 21824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227531, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.429 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:06:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:06:59.430 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[76c32303-a1b2-478b-a633-bcdf9c7ac9f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:06:59 np0005546909 systemd[1]: run-netns-ovnmeta\x2dfc6ce614\x2dd0f7\x2d413f\x2dbc3e\x2d26f7271993d9.mount: Deactivated successfully.
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.908 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance shutdown successfully after 3 seconds.#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.914 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.915 187212 DEBUG nova.objects.instance [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.951 187212 DEBUG nova.objects.instance [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'flavor' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.984 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:06:59 np0005546909 nova_compute[187208]: 2025-12-05 12:06:59.984 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.201 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Beginning cold snapshot process#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.279 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 DEBUG oslo_concurrency.lockutils [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 DEBUG nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.280 187212 WARNING nova.compute.manager [req-4af47932-2a4c-4c29-a09e-226a4352de9d req-02ff40ac-8440-470e-a454-3f49534e8410 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-unplugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state active and task_state shelving_image_pending_upload.#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.367 187212 DEBUG nova.privsep.utils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.368 187212 DEBUG oslo_concurrency.processutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk /var/lib/nova/instances/snapshots/tmp5j8wm6k8/adb9d9dddeed4a54b09e9bedf8cd0b63 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.994 187212 DEBUG oslo_concurrency.processutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk /var/lib/nova/instances/snapshots/tmp5j8wm6k8/adb9d9dddeed4a54b09e9bedf8cd0b63" returned: 0 in 0.626s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:00 np0005546909 nova_compute[187208]: 2025-12-05 12:07:00.995 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.405 187212 DEBUG nova.network.neutron [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.620 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.620 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.643 187212 DEBUG nova.objects.instance [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.672 187212 DEBUG nova.virt.libvirt.vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.673 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.675 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.680 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.681 187212 DEBUG nova.compute.manager [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.682 187212 DEBUG nova.compute.manager [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing instance network info cache due to event network-changed-9357c6a6-eb6f-4ab9-bfd6-486765004ac5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.682 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.683 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.683 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Refreshing network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.686 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.688 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.688 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:83:50:d2"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <target dev="tap8749491f-af"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.758 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.763 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface>not found in domain: <domain type='kvm' id='64'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <name>instance-0000003c</name>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <memory unit='KiB'>131072</memory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <vcpu placement='static'>1</vcpu>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <resource>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <partition>/machine</partition>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </resource>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <sysinfo type='smbios'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='manufacturer'>RDO</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='product'>OpenStack Compute</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='serial'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='uuid'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='family'>Virtual Machine</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <boot dev='hd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <smbios mode='sysinfo'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <vmcoreinfo state='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <cpu mode='custom' match='exact' check='full'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <vendor>AMD</vendor>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='x2apic'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc-deadline'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='hypervisor'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc_adjust'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='spec-ctrl'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='stibp'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='cmp_legacy'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='overflow-recov'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='succor'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='ibrs'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='amd-ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='virt-ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='lbrv'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='tsc-scale'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='vmcb-clean'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='flushbyasid'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pause-filter'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pfthreshold'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='xsaves'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svm'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='topoext'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='npt'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='nrip-save'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <clock offset='utc'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='pit' tickpolicy='delay'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='hpet' present='no'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_poweroff>destroy</on_poweroff>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_reboot>restart</on_reboot>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_crash>destroy</on_crash>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <disk type='file' device='disk'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk' index='2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backingStore type='file' index='3'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <format type='raw'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <backingStore/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      </backingStore>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='vda' bus='virtio'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='virtio-disk0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <disk type='file' device='cdrom'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='qemu' type='raw' cache='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config' index='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backingStore/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='sda' bus='sata'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <readonly/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='sata0-0-0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='0' model='pcie-root'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pcie.0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='1' port='0x10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='2' port='0x11'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='3' port='0x12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='4' port='0x13'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='5' port='0x14'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='6' port='0x15'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='7' port='0x16'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='8' port='0x17'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.8'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='9' port='0x18'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.9'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='10' port='0x19'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='11' port='0x1a'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.11'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='12' port='0x1b'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='13' port='0x1c'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.13'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='14' port='0x1d'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.14'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='15' port='0x1e'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.15'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='16' port='0x1f'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.16'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='17' port='0x20'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.17'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='18' port='0x21'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.18'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='19' port='0x22'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.19'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='20' port='0x23'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.20'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='21' port='0x24'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.21'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='22' port='0x25'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.22'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='23' port='0x26'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.23'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='24' port='0x27'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.24'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='25' port='0x28'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.25'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-pci-bridge'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.26'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='usb'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='sata' index='0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='ide'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:7b:68:b7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='tap2064bfa7-12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='net0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:83:50:d2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='tap8749491f-af'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='net1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <serial type='pty'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source path='/dev/pts/6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target type='isa-serial' port='0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <model name='isa-serial'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      </target>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <console type='pty' tty='/dev/pts/6'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source path='/dev/pts/6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target type='serial' port='0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </console>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='tablet' bus='usb'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='usb' bus='0' port='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='mouse' bus='ps2'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='keyboard' bus='ps2'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <graphics type='vnc' port='5906' autoport='yes' listen='::0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <listen type='address' address='::0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <audio id='1' type='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model type='virtio' heads='1' primary='yes'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='video0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <watchdog model='itco' action='reset'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='watchdog0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </watchdog>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <memballoon model='virtio'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <stats period='10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='balloon0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <rng model='virtio'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backend model='random'>/dev/urandom</backend>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='rng0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <label>system_u:system_r:svirt_t:s0:c16,c952</label>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c16,c952</imagelabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <label>+107:+107</label>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <imagelabel>+107:+107</imagelabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.763 187212 INFO nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the persistent domain config.
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.764 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tap8749491f-af with device alias net1 from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.764 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:83:50:d2"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <target dev="tap8749491f-af"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Dec  5 07:07:01 np0005546909 kernel: tap8749491f-af (unregistering): left promiscuous mode
Dec  5 07:07:01 np0005546909 NetworkManager[55691]: <info>  [1764936421.8760] device (tap8749491f-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.884 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936421.8838403, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.885 187212 DEBUG nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tap8749491f-af with device alias net1 for instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.886 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Dec  5 07:07:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:01Z|00521|binding|INFO|Releasing lport 8749491f-af83-499c-b823-14496cf1872d from this chassis (sb_readonly=0)
Dec  5 07:07:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:01Z|00522|binding|INFO|Setting lport 8749491f-af83-499c-b823-14496cf1872d down in Southbound
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:01Z|00523|binding|INFO|Removing iface tap8749491f-af ovn-installed in OVS
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.892 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:83:50:d2"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap8749491f-af"/></interface>not found in domain: <domain type='kvm' id='64'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <name>instance-0000003c</name>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <uuid>25918fc4-05ec-4a16-b77f-ca1d352a2763</uuid>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:06:52</nova:creationTime>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:port uuid="8749491f-af83-499c-b823-14496cf1872d">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <memory unit='KiB'>131072</memory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <vcpu placement='static'>1</vcpu>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <resource>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <partition>/machine</partition>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </resource>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <sysinfo type='smbios'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='manufacturer'>RDO</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='product'>OpenStack Compute</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='serial'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='uuid'>25918fc4-05ec-4a16-b77f-ca1d352a2763</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <entry name='family'>Virtual Machine</entry>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <boot dev='hd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <smbios mode='sysinfo'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <vmcoreinfo state='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <cpu mode='custom' match='exact' check='full'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <vendor>AMD</vendor>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='x2apic'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc-deadline'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='hypervisor'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc_adjust'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='spec-ctrl'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='stibp'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='cmp_legacy'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='overflow-recov'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='succor'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='ibrs'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='amd-ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='virt-ssbd'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='lbrv'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='tsc-scale'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='vmcb-clean'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='flushbyasid'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pause-filter'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pfthreshold'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='xsaves'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svm'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='require' name='topoext'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='npt'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <feature policy='disable' name='nrip-save'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <clock offset='utc'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='pit' tickpolicy='delay'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <timer name='hpet' present='no'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_poweroff>destroy</on_poweroff>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_reboot>restart</on_reboot>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <on_crash>destroy</on_crash>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <disk type='file' device='disk'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk' index='2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backingStore type='file' index='3'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <format type='raw'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <backingStore/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      </backingStore>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='vda' bus='virtio'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='virtio-disk0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <disk type='file' device='cdrom'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='qemu' type='raw' cache='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/disk.config' index='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backingStore/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='sda' bus='sata'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <readonly/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='sata0-0-0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='0' model='pcie-root'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pcie.0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='1' port='0x10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='2' port='0x11'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='3' port='0x12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='4' port='0x13'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='5' port='0x14'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='6' port='0x15'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='7' port='0x16'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='8' port='0x17'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.8'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='9' port='0x18'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.9'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='10' port='0x19'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='11' port='0x1a'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.11'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='12' port='0x1b'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='13' port='0x1c'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.13'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='14' port='0x1d'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.14'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='15' port='0x1e'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.15'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='16' port='0x1f'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.16'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='17' port='0x20'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.17'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='18' port='0x21'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.18'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='19' port='0x22'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.19'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='20' port='0x23'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.20'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='21' port='0x24'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.21'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='22' port='0x25'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.22'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='23' port='0x26'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.23'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='24' port='0x27'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.24'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target chassis='25' port='0x28'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.25'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model name='pcie-pci-bridge'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='pci.26'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='usb'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <controller type='sata' index='0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='ide'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:7b:68:b7'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target dev='tap2064bfa7-12'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='net0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <serial type='pty'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source path='/dev/pts/6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target type='isa-serial' port='0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:        <model name='isa-serial'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      </target>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <console type='pty' tty='/dev/pts/6'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <source path='/dev/pts/6'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763/console.log' append='off'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <target type='serial' port='0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </console>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='tablet' bus='usb'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='usb' bus='0' port='1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='mouse' bus='ps2'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input1'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <input type='keyboard' bus='ps2'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='input2'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <graphics type='vnc' port='5906' autoport='yes' listen='::0'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <listen type='address' address='::0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <audio id='1' type='none'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <model type='virtio' heads='1' primary='yes'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='video0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <watchdog model='itco' action='reset'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='watchdog0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </watchdog>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <memballoon model='virtio'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <stats period='10'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='balloon0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <rng model='virtio'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <backend model='random'>/dev/urandom</backend>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <alias name='rng0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <label>system_u:system_r:svirt_t:s0:c16,c952</label>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c16,c952</imagelabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <label>+107:+107</label>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <imagelabel>+107:+107</imagelabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.892 187212 INFO nova.virt.libvirt.driver [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tap8749491f-af from instance 25918fc4-05ec-4a16-b77f-ca1d352a2763 from the live domain config.#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.893 187212 DEBUG nova.virt.libvirt.vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.893 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "8749491f-af83-499c-b823-14496cf1872d", "address": "fa:16:3e:83:50:d2", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8749491f-af", "ovs_interfaceid": "8749491f-af83-499c-b823-14496cf1872d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.894 187212 DEBUG nova.network.os_vif_util [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.894 187212 DEBUG os_vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.895 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.896 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8749491f-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.903 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:50:d2 10.100.0.14'], port_security=['fa:16:3e:83:50:d2 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=8749491f-af83-499c-b823-14496cf1872d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.905 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 8749491f-af83-499c-b823-14496cf1872d in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.907 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.911 187212 INFO os_vif [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:50:d2,bridge_name='br-int',has_traffic_filtering=True,id=8749491f-af83-499c-b823-14496cf1872d,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8749491f-af')#033[00m
Dec  5 07:07:01 np0005546909 nova_compute[187208]: 2025-12-05 12:07:01.912 187212 DEBUG nova.virt.libvirt.guest [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1604830094</nova:name>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:07:01</nova:creationTime>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    <nova:port uuid="2064bfa7-125e-466c-9365-6c0ec6655113">
Dec  5 07:07:01 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:07:01 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:07:01 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9df379-2cd7-4ad2-80a1-e8dc17d4cdc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.945 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5607d35-b01e-4a17-87ad-789c3d2a2024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.950 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[33d6b5de-7a87-4626-b4eb-ed1ea1b7fc85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:01.980 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[655aa3ff-4a45-456f-b076-d86fb7142b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.000 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[57f4937b-9c9d-4d66-a51c-ecb287da83e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375484, 'reachable_time': 41038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227568, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.016 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7c64100c-21f8-4494-a07b-554c4873f062]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375496, 'tstamp': 375496}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227569, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375499, 'tstamp': 375499}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227569, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.018 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.026 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:02.027 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:02 np0005546909 nova_compute[187208]: 2025-12-05 12:07:02.844 187212 DEBUG nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.013 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.014 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.018 187212 DEBUG nova.network.neutron [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.081 187212 DEBUG oslo_concurrency.lockutils [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.081 187212 DEBUG nova.compute.manager [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.082 187212 DEBUG nova.compute.manager [None req-e6add906-993d-419d-b9be-9c15c98dd5f6 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] network_info to inject: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Dec  5 07:07:03 np0005546909 podman[227571]: 2025-12-05 12:07:03.214538381 +0000 UTC m=+0.062841575 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:07:03 np0005546909 podman[227572]: 2025-12-05 12:07:03.244011296 +0000 UTC m=+0.089750656 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.433 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updated VIF entry in instance network info cache for port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.435 187212 DEBUG nova.network.neutron [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [{"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.456 187212 DEBUG oslo_concurrency.lockutils [req-4ebe727f-9679-44eb-83e3-17361f847b8f req-24d49f19-2259-4853-bfe4-80bce74a72d9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-472c7e2c-bdad-4230-904b-6937ceb872d2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.644 187212 DEBUG nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.645 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG oslo_concurrency.lockutils [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 DEBUG nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:03 np0005546909 nova_compute[187208]: 2025-12-05 12:07:03.646 187212 WARNING nova.compute.manager [req-5dcd155f-da8d-4e40-b9b0-1ab5a1265c43 req-128b212a-3906-476b-bbcd-aa7d932f1bff 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.002 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.002 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.003 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.004 187212 INFO nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Terminating instance#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.005 187212 DEBUG nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:04 np0005546909 kernel: tap9357c6a6-eb (unregistering): left promiscuous mode
Dec  5 07:07:04 np0005546909 NetworkManager[55691]: <info>  [1764936424.0716] device (tap9357c6a6-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00524|binding|INFO|Releasing lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 from this chassis (sb_readonly=0)
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00525|binding|INFO|Setting lport 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 down in Southbound
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00526|binding|INFO|Removing iface tap9357c6a6-eb ovn-installed in OVS
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.080 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.098 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.112 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Snapshot image upload complete#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.112 187212 DEBUG nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000037.scope: Deactivated successfully.
Dec  5 07:07:04 np0005546909 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000037.scope: Consumed 15.414s CPU time.
Dec  5 07:07:04 np0005546909 systemd-machined[153543]: Machine qemu-59-instance-00000037 terminated.
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.272 187212 INFO nova.virt.libvirt.driver [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Instance destroyed successfully.#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.273 187212 DEBUG nova.objects.instance [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lazy-loading 'resources' on Instance uuid 472c7e2c-bdad-4230-904b-6937ceb872d2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.312 187212 INFO nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Port 8749491f-af83-499c-b823-14496cf1872d from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.312 187212 DEBUG nova.network.neutron [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [{"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.329 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:e8:08 10.100.0.14'], port_security=['fa:16:3e:08:e8:08 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '472c7e2c-bdad-4230-904b-6937ceb872d2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f4c4888-4b32-4259-8441-31af091e0c7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '85037de7275442698e604ee3f6283cbc', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed3fff5f-a24a-492e-ba85-8f010d446cfc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac2e7e6b-9342-46f8-a910-5de5a261f0a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=9357c6a6-eb6f-4ab9-bfd6-486765004ac5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.331 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 9357c6a6-eb6f-4ab9-bfd6-486765004ac5 in datapath 0f4c4888-4b32-4259-8441-31af091e0c7d unbound from our chassis#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.333 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f4c4888-4b32-4259-8441-31af091e0c7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[66f061fa-687e-457d-9196-8f9ceb221e5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.335 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d namespace which is not needed anymore#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.404 187212 DEBUG nova.virt.libvirt.vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-292918791',display_name='tempest-FloatingIPsAssociationTestJSON-server-292918791',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-292918791',id=55,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='85037de7275442698e604ee3f6283cbc',ramdisk_id='',reservation_id='r-c3vnhg04',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-883508882',owner_user_name='tempest-FloatingIPsAssociationTestJSON-883508882-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:46Z,user_data=None,user_id='8cf2534e7c394130b675e44ed567401b',uuid=472c7e2c-bdad-4230-904b-6937ceb872d2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG nova.network.os_vif_util [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converting VIF {"id": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "address": "fa:16:3e:08:e8:08", "network": {"id": "0f4c4888-4b32-4259-8441-31af091e0c7d", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-254966807-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "85037de7275442698e604ee3f6283cbc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9357c6a6-eb", "ovs_interfaceid": "9357c6a6-eb6f-4ab9-bfd6-486765004ac5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG nova.network.os_vif_util [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.405 187212 DEBUG os_vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.407 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9357c6a6-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.408 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.413 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.416 187212 INFO os_vif [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:08:e8:08,bridge_name='br-int',has_traffic_filtering=True,id=9357c6a6-eb6f-4ab9-bfd6-486765004ac5,network=Network(0f4c4888-4b32-4259-8441-31af091e0c7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9357c6a6-eb')#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.417 187212 INFO nova.virt.libvirt.driver [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deleting instance files /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2_del#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.417 187212 INFO nova.virt.libvirt.driver [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deletion of /var/lib/nova/instances/472c7e2c-bdad-4230-904b-6937ceb872d2_del complete#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.423 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-25918fc4-05ec-4a16-b77f-ca1d352a2763" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.448 187212 INFO nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Shelve offloading#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.455 187212 DEBUG oslo_concurrency.lockutils [None req-e440a6f1-3532-4560-a6a7-898e2a83e02c 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-25918fc4-05ec-4a16-b77f-ca1d352a2763-8749491f-af83-499c-b823-14496cf1872d" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.459 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.459 187212 DEBUG nova.compute.manager [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [NOTICE]   (225592) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [WARNING]  (225592) : Exiting Master process...
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.462 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.462 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [ALERT]    (225592) : Current worker (225594) exited with code 143 (Terminated)
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d[225588]: [WARNING]  (225592) : All workers exited. Exiting... (0)
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.463 187212 DEBUG nova.network.neutron [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: libpod-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope: Deactivated successfully.
Dec  5 07:07:04 np0005546909 podman[227668]: 2025-12-05 12:07:04.472912417 +0000 UTC m=+0.050792755 container died 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.505 187212 INFO nova.compute.manager [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay-346d0c910feecbfb16f9369239d1a7161a45d173e7171bca2bd39f251f209cca-merged.mount: Deactivated successfully.
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG oslo.service.loopingcall [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.506 187212 DEBUG nova.network.neutron [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:04 np0005546909 podman[227668]: 2025-12-05 12:07:04.512418533 +0000 UTC m=+0.090298861 container cleanup 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:07:04 np0005546909 systemd[1]: libpod-conmon-96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302.scope: Deactivated successfully.
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.569 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.570 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:04 np0005546909 podman[227697]: 2025-12-05 12:07:04.578699386 +0000 UTC m=+0.044302996 container remove 96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.584 104579 DEBUG eventlet.wsgi.server [-] (104579) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.584 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[608cd84d-a1cf-4b1c-86e9-7dbe60e28a1e]: (4, ('Fri Dec  5 12:07:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d (96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302)\n96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302\nFri Dec  5 12:07:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d (96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302)\n96f14ca8d4cdf07b9920a9d32ba147ebd12daf27298dd8fd4786cadf1369e302\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.586 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3cffab9e-e0d3-4f7a-8520-453010f459c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.586 104579 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: Accept: */*#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: Connection: close#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: Content-Type: text/plain#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: Host: 169.254.169.254#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: User-Agent: curl/7.84.0#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: X-Forwarded-For: 10.100.0.10#015
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: X-Ovn-Network-Id: 59233d66-44e6-47b3-b612-4f7d677af03d __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.588 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f4c4888-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.590 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 kernel: tap0f4c4888-40: left promiscuous mode
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.608 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.611 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0829f761-b8c4-465d-9cfe-34bfc240221d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.625 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[844062a8-623d-4dd2-b2ad-135db8e0f0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.626 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d25625d-8330-4df5-a8d0-85b7628869d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.644 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[178972a3-0ccb-475c-bc0f-ace6aafbf5a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372331, 'reachable_time': 38588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227715, 'error': None, 'target': 'ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.646 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f4c4888-4b32-4259-8441-31af091e0c7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.646 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3647f202-51d0-46f3-bbc0-843a46dd176c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: run-netns-ovnmeta\x2d0f4c4888\x2d4b32\x2d4259\x2d8441\x2d31af091e0c7d.mount: Deactivated successfully.
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.767 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.768 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.769 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.770 187212 INFO nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Terminating instance#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.771 187212 DEBUG nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:04 np0005546909 kernel: tap2064bfa7-12 (unregistering): left promiscuous mode
Dec  5 07:07:04 np0005546909 NetworkManager[55691]: <info>  [1764936424.7963] device (tap2064bfa7-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00527|binding|INFO|Releasing lport 2064bfa7-125e-466c-9365-6c0ec6655113 from this chassis (sb_readonly=0)
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00528|binding|INFO|Setting lport 2064bfa7-125e-466c-9365-6c0ec6655113 down in Southbound
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:04Z|00529|binding|INFO|Removing iface tap2064bfa7-12 ovn-installed in OVS
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.809 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.821 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:68:b7 10.100.0.12'], port_security=['fa:16:3e:7b:68:b7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '25918fc4-05ec-4a16-b77f-ca1d352a2763', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '38cb0acb-7ac3-4fef-baeb-661c59e2e07c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2064bfa7-125e-466c-9365-6c0ec6655113) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.822 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2064bfa7-125e-466c-9365-6c0ec6655113 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.824 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.825 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b28e76a-93ab-4501-8ad0-6f9ad4fb4fd9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:04.826 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace which is not needed anymore#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000003c.scope: Deactivated successfully.
Dec  5 07:07:04 np0005546909 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000003c.scope: Consumed 13.920s CPU time.
Dec  5 07:07:04 np0005546909 systemd-machined[153543]: Machine qemu-64-instance-0000003c terminated.
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [NOTICE]   (226293) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [WARNING]  (226293) : Exiting Master process...
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [ALERT]    (226293) : Current worker (226296) exited with code 143 (Terminated)
Dec  5 07:07:04 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[226289]: [WARNING]  (226293) : All workers exited. Exiting... (0)
Dec  5 07:07:04 np0005546909 systemd[1]: libpod-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope: Deactivated successfully.
Dec  5 07:07:04 np0005546909 podman[227738]: 2025-12-05 12:07:04.950838115 +0000 UTC m=+0.044408879 container died b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay-39a6a101eec5b4484b9949c620581716165bc1e8dfc205f2cf46df83cc7fa1cb-merged.mount: Deactivated successfully.
Dec  5 07:07:04 np0005546909 podman[227738]: 2025-12-05 12:07:04.989315172 +0000 UTC m=+0.082885956 container cleanup b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:07:04 np0005546909 nova_compute[187208]: 2025-12-05 12:07:04.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:04 np0005546909 systemd[1]: libpod-conmon-b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934.scope: Deactivated successfully.
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.040 187212 INFO nova.virt.libvirt.driver [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Instance destroyed successfully.#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.042 187212 DEBUG nova.objects.instance [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid 25918fc4-05ec-4a16-b77f-ca1d352a2763 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.057 187212 DEBUG nova.virt.libvirt.vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1604830094',display_name='tempest-AttachInterfacesTestJSON-server-1604830094',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1604830094',id=60,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGfWejSHdN+jkWvFbUpP/3WQc/ML75ZJ8FQ3jOm1jHRfJUqUW+s+8nPpXgJlJ2MXiX/b4UD7bx2CcrRKwCdsWfcFUsiz+cn9CQ0ruzkboWFhGH59N2NddxlAthSxEhyWoQ==',key_name='tempest-keypair-1763466950',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-e5ux45ek',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=25918fc4-05ec-4a16-b77f-ca1d352a2763,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.058 187212 DEBUG nova.network.os_vif_util [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "2064bfa7-125e-466c-9365-6c0ec6655113", "address": "fa:16:3e:7b:68:b7", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2064bfa7-12", "ovs_interfaceid": "2064bfa7-125e-466c-9365-6c0ec6655113", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.058 187212 DEBUG nova.network.os_vif_util [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.059 187212 DEBUG os_vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.061 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.061 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2064bfa7-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:05 np0005546909 podman[227775]: 2025-12-05 12:07:05.063184635 +0000 UTC m=+0.049319172 container remove b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.119 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71fef8b3-d4b8-49f8-8cc2-aef8478b55be]: (4, ('Fri Dec  5 12:07:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934)\nb18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934\nFri Dec  5 12:07:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (b18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934)\nb18d4b570e69f18d1d0d95bd4c934041c73659a78012f3dcac0d78a1556bd934\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.121 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d10c6f82-e83b-453e-aa4b-9479c8ab2ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.122 187212 INFO os_vif [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:68:b7,bridge_name='br-int',has_traffic_filtering=True,id=2064bfa7-125e-466c-9365-6c0ec6655113,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2064bfa7-12')#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.122 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.123 187212 INFO nova.virt.libvirt.driver [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deleting instance files /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763_del#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.124 187212 INFO nova.virt.libvirt.driver [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deletion of /var/lib/nova/instances/25918fc4-05ec-4a16-b77f-ca1d352a2763_del complete#033[00m
Dec  5 07:07:05 np0005546909 kernel: tapfbfed6fc-30: left promiscuous mode
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.145 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0547ba3e-7270-4f6c-960e-0431b69cce28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.160 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[05a117c3-d9d0-47f1-963c-80843eaefa7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.161 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[feb9c0e4-b3ec-4f44-9107-22a159ab8700]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.174 187212 INFO nova.compute.manager [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG oslo.service.loopingcall [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:05 np0005546909 nova_compute[187208]: 2025-12-05 12:07:05.175 187212 DEBUG nova.network.neutron [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4909a4-b1a6-42b7-b927-6894cad3b2e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375474, 'reachable_time': 25020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227801, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.179 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.180 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd70753-53ac-4ac9-87e9-c88ec88d8409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:05 np0005546909 systemd[1]: run-netns-ovnmeta\x2dfbfed6fc\x2d3701\x2d4311\x2da4c2\x2d8c49c5b7584c.mount: Deactivated successfully.
Dec  5 07:07:05 np0005546909 haproxy-metadata-proxy-59233d66-44e6-47b3-b612-4f7d677af03d[227143]: 10.100.0.10:41388 [05/Dec/2025:12:07:04.583] listener listener/metadata 0/0/0/1345/1345 200 1655 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.928 104579 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Dec  5 07:07:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:05.928 104579 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1671 time: 1.3425035#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.091 187212 DEBUG nova.network.neutron [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.111 187212 INFO nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Took 1.60 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.170 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.171 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.320 187212 DEBUG nova.network.neutron [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.401 187212 DEBUG nova.compute.provider_tree [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.537 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.538 187212 DEBUG nova.network.neutron [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}, {"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.686 187212 DEBUG nova.scheduler.client.report [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.690 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 WARNING nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-unplugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.691 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 WARNING nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-8749491f-af83-499c-b823-14496cf1872d for instance with vm_state active and task_state None.#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG nova.compute.manager [req-dc5ef9db-85dd-45ba-8b2e-4a31c609c8f2 req-ffa1e72d-f88c-4893-8c67-efa83fcb3607 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-deleted-8749491f-af83-499c-b823-14496cf1872d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.692 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.709 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.738 187212 INFO nova.scheduler.client.report [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Deleted allocations for instance 472c7e2c-bdad-4230-904b-6937ceb872d2#033[00m
Dec  5 07:07:06 np0005546909 nova_compute[187208]: 2025-12-05 12:07:06.813 187212 DEBUG oslo_concurrency.lockutils [None req-c5038764-3af8-49c9-b56f-3b4f09a7a6c6 8cf2534e7c394130b675e44ed567401b 85037de7275442698e604ee3f6283cbc - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.046 187212 DEBUG nova.objects.instance [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'flavor' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.068 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.068 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.301 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.301 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.302 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.302 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.303 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.304 187212 INFO nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Terminating instance#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.305 187212 DEBUG nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:07 np0005546909 kernel: tapd10caa85-df (unregistering): left promiscuous mode
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.335 187212 DEBUG nova.network.neutron [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:07 np0005546909 NetworkManager[55691]: <info>  [1764936427.3371] device (tapd10caa85-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.342 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:07Z|00530|binding|INFO|Releasing lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 from this chassis (sb_readonly=0)
Dec  5 07:07:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:07Z|00531|binding|INFO|Setting lport d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 down in Southbound
Dec  5 07:07:07 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:07Z|00532|binding|INFO|Removing iface tapd10caa85-df ovn-installed in OVS
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.350 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:8d:e9 10.100.0.10'], port_security=['fa:16:3e:cc:8d:e9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ed00d159-9d70-481e-93be-ea180fea04ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59233d66-44e6-47b3-b612-4f7d677af03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc1fd38e325f4a2caa75aeab79da75d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb353a76-4787-4857-933e-e95743324e9e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f37497c0-7b03-4b0b-94d8-7ed5a2c705cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 in datapath 59233d66-44e6-47b3-b612-4f7d677af03d unbound from our chassis#033[00m
Dec  5 07:07:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.354 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59233d66-44e6-47b3-b612-4f7d677af03d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.354 187212 DEBUG nova.compute.manager [req-46859646-168e-4d33-b072-3ed9aba301ff req-3bc0f9e2-9dc1-4e2c-a6f2-9815a75e122e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-deleted-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.356 187212 INFO nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Took 2.18 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.359 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f5846837-b84e-452d-985b-215c52384045]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:07.360 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d namespace which is not needed anymore#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.364 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000040.scope: Deactivated successfully.
Dec  5 07:07:07 np0005546909 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000040.scope: Consumed 14.810s CPU time.
Dec  5 07:07:07 np0005546909 systemd-machined[153543]: Machine qemu-68-instance-00000040 terminated.
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.409 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.410 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:07 np0005546909 podman[227805]: 2025-12-05 12:07:07.427302136 +0000 UTC m=+0.059257011 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.529 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.573 187212 INFO nova.virt.libvirt.driver [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Instance destroyed successfully.#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.573 187212 DEBUG nova.objects.instance [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lazy-loading 'resources' on Instance uuid ed00d159-9d70-481e-93be-ea180fea04ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.588 187212 DEBUG nova.virt.libvirt.vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=64,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBATHw79fzCFS1LAWHUiavQB3gUFaXpS81QU/Ce6wZ4HmvTj5LBGoan0DqDckMccItIq/MaTr8w95EnUae9L4Bz4KldjVTS0oi0uLUNfFAJiLjBukcvGPiZbx9R9d1EWHww==',key_name='tempest-keypair-599091465',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dc1fd38e325f4a2caa75aeab79da75d3',ramdisk_id='',reservation_id='r-cei648o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-303309807',owner_user_name='tempest-ServersV294TestFqdnHostnames-303309807-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='077bcce844cb42a197dcd6100549b7d3',uuid=ed00d159-9d70-481e-93be-ea180fea04ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.588 187212 DEBUG nova.network.os_vif_util [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converting VIF {"id": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "address": "fa:16:3e:cc:8d:e9", "network": {"id": "59233d66-44e6-47b3-b612-4f7d677af03d", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-2087772180-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "dc1fd38e325f4a2caa75aeab79da75d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd10caa85-df", "ovs_interfaceid": "d10caa85-dfcd-49ce-8ff7-2c2a68d1d731", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.589 187212 DEBUG nova.network.os_vif_util [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.589 187212 DEBUG os_vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.591 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.591 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd10caa85-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.595 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.596 187212 DEBUG nova.compute.provider_tree [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.600 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.603 187212 INFO os_vif [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:8d:e9,bridge_name='br-int',has_traffic_filtering=True,id=d10caa85-dfcd-49ce-8ff7-2c2a68d1d731,network=Network(59233d66-44e6-47b3-b612-4f7d677af03d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd10caa85-df')#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.603 187212 INFO nova.virt.libvirt.driver [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deleting instance files /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba_del#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.604 187212 INFO nova.virt.libvirt.driver [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deletion of /var/lib/nova/instances/ed00d159-9d70-481e-93be-ea180fea04ba_del complete#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.614 187212 DEBUG nova.scheduler.client.report [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.643 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.656 187212 INFO nova.compute.manager [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.656 187212 DEBUG oslo.service.loopingcall [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.657 187212 DEBUG nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.657 187212 DEBUG nova.network.neutron [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.666 187212 INFO nova.scheduler.client.report [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance 25918fc4-05ec-4a16-b77f-ca1d352a2763#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.749 187212 DEBUG oslo_concurrency.lockutils [None req-ce34b95d-6b44-42ae-8f51-792e0d394889 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:07 np0005546909 nova_compute[187208]: 2025-12-05 12:07:07.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [NOTICE]   (227141) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : Exiting Master process...
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : Exiting Master process...
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [ALERT]    (227141) : Current worker (227143) exited with code 143 (Terminated)
Dec  5 07:07:07 np0005546909 neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d[227137]: [WARNING]  (227141) : All workers exited. Exiting... (0)
Dec  5 07:07:07 np0005546909 systemd[1]: libpod-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope: Deactivated successfully.
Dec  5 07:07:07 np0005546909 podman[227846]: 2025-12-05 12:07:07.82274168 +0000 UTC m=+0.374429566 container died 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  5 07:07:08 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:08 np0005546909 systemd[1]: var-lib-containers-storage-overlay-a0d77ec8b1cc024940e91355c3fecada2e5d7bf69ad2fc36a67c677303a54e3e-merged.mount: Deactivated successfully.
Dec  5 07:07:08 np0005546909 podman[227846]: 2025-12-05 12:07:08.047829272 +0000 UTC m=+0.599517158 container cleanup 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:08 np0005546909 systemd[1]: libpod-conmon-44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b.scope: Deactivated successfully.
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.149 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-unplugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.150 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "472c7e2c-bdad-4230-904b-6937ceb872d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] No waiting events found dispatching network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Received unexpected event network-vif-plugged-9357c6a6-eb6f-4ab9-bfd6-486765004ac5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.151 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.152 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-unplugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG oslo_concurrency.lockutils [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "25918fc4-05ec-4a16-b77f-ca1d352a2763-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.153 187212 DEBUG nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] No waiting events found dispatching network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.154 187212 WARNING nova.compute.manager [req-31a2de92-0c13-4d39-90e9-7f40bc80bbe4 req-df6f1638-f5f6-4a99-b407-f4be8a2dea55 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received unexpected event network-vif-plugged-2064bfa7-125e-466c-9365-6c0ec6655113 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:08 np0005546909 podman[227892]: 2025-12-05 12:07:08.16319949 +0000 UTC m=+0.093907486 container remove 44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.168 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9fd42b-07cd-4ff1-9a28-886cff9d7ce4]: (4, ('Fri Dec  5 12:07:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d (44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b)\n44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b\nFri Dec  5 12:07:08 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d (44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b)\n44a580194440cb57a62d9721a458523c8d584bbf00689778c1dcf32cb1ed9d8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a8fd98-4178-458e-92e1-0ab93697e94b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.171 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59233d66-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:08 np0005546909 kernel: tap59233d66-40: left promiscuous mode
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.184 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.185 187212 DEBUG nova.objects.instance [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'resources' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.189 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[33118861-e5c2-4478-8812-5aed35775211]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.201 187212 DEBUG nova.virt.libvirt.vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": 
"fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.202 187212 DEBUG nova.network.os_vif_util [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.202 187212 DEBUG nova.network.os_vif_util [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.203 187212 DEBUG os_vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.204 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.204 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac02dd63-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.207 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.209 187212 INFO os_vif [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.210 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deleting instance files /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.210 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f4ea93c-87e7-485f-8181-0159d87781c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.211 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[726cc51f-1868-4c2c-8084-7ce21434a083]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.216 187212 INFO nova.virt.libvirt.driver [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deletion of /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del complete#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.225 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d679fd2-9c29-4901-a2f6-0e1ca75b871e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377979, 'reachable_time': 23454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227912, 'error': None, 'target': 'ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.229 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-59233d66-44e6-47b3-b612-4f7d677af03d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:08.229 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb070e6-c58f-4e09-802e-b73b3362caa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:08 np0005546909 systemd[1]: run-netns-ovnmeta\x2d59233d66\x2d44e6\x2d47b3\x2db612\x2d4f7d677af03d.mount: Deactivated successfully.
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.303 187212 DEBUG nova.network.neutron [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.827 187212 INFO nova.scheduler.client.report [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Deleted allocations for instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.963 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:08 np0005546909 nova_compute[187208]: 2025-12-05 12:07:08.963 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.127 187212 DEBUG nova.compute.provider_tree [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.297 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.298 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.304 187212 DEBUG nova.scheduler.client.report [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.336 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.357 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.532 187212 DEBUG oslo_concurrency.lockutils [None req-7ac90414-afd9-4024-8f55-ecd8b328356e bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.615 187212 DEBUG nova.network.neutron [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.638 187212 INFO nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Took 1.98 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.644 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.644 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.653 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.653 187212 INFO nova.compute.claims [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.717 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.907 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Received event network-vif-deleted-2064bfa7-125e-466c-9365-6c0ec6655113 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.908 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 WARNING nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-unplugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.909 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 DEBUG oslo_concurrency.lockutils [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 DEBUG nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] No waiting events found dispatching network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.910 187212 WARNING nova.compute.manager [req-1449bdd5-435c-4e89-aada-bb93719243bd req-b0c207d9-4e77-4908-9d62-ac90afbfa4ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received unexpected event network-vif-plugged-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.963 187212 DEBUG nova.compute.provider_tree [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:09 np0005546909 nova_compute[187208]: 2025-12-05 12:07:09.986 187212 DEBUG nova.scheduler.client.report [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.007 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.007 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.010 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.047 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.048 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.065 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.082 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.177 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.178 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.178 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating image(s)#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.179 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.179 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.180 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.194 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.228 187212 DEBUG nova.compute.provider_tree [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.231 187212 DEBUG nova.network.neutron [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.253 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.253 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.254 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.256 187212 INFO nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Terminating instance#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.257 187212 DEBUG nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.259 187212 DEBUG nova.scheduler.client.report [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.264 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.265 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.265 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.279 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.303 187212 DEBUG oslo_concurrency.lockutils [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.304 187212 DEBUG nova.compute.manager [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.304 187212 DEBUG nova.compute.manager [None req-94ff800a-c3c6-49c1-a68e-d8908fd675c0 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] network_info to inject: |[{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.308 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.318 187212 DEBUG nova.compute.manager [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.318 187212 DEBUG nova.compute.manager [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing instance network info cache due to event network-changed-88c7b630-e84b-4a35-8c8f-f934e7cabaf6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.319 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Refreshing network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.338 187212 INFO nova.scheduler.client.report [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Deleted allocations for instance ed00d159-9d70-481e-93be-ea180fea04ba#033[00m
Dec  5 07:07:10 np0005546909 kernel: tap5683f8a8-69 (unregistering): left promiscuous mode
Dec  5 07:07:10 np0005546909 NetworkManager[55691]: <info>  [1764936430.3460] device (tap5683f8a8-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.348 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.349 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:10Z|00533|binding|INFO|Releasing lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 from this chassis (sb_readonly=0)
Dec  5 07:07:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:10Z|00534|binding|INFO|Setting lport 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 down in Southbound
Dec  5 07:07:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:10Z|00535|binding|INFO|Removing iface tap5683f8a8-69 ovn-installed in OVS
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.365 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:3c:38 10.100.0.11'], port_security=['fa:16:3e:d3:3c:38 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b81bb939-d14f-4a72-b7fe-95fc5d8810a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.366 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.371 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.375 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.385 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[715a1027-ca91-4efc-8ff9-b6458a79998e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003a.scope: Deactivated successfully.
Dec  5 07:07:10 np0005546909 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000003a.scope: Consumed 14.513s CPU time.
Dec  5 07:07:10 np0005546909 systemd-machined[153543]: Machine qemu-62-instance-0000003a terminated.
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.415 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[24ec07fd-35a4-4823-9aa8-1f87b3e5de69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.419 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b579f1d-2328-4499-9983-ad557e5117c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.440 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk 1073741824" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.441 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.442 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.450 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c1e7480a-7454-429b-84c3-bb69f957b182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.466 187212 DEBUG nova.policy [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.470 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8248c1a8-e3f3-4f23-8207-d1d4bdfcc6cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 17, 'rx_bytes': 868, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227935, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.471 187212 DEBUG oslo_concurrency.lockutils [None req-8e32f0c4-335a-451a-819d-fc7952b6ac32 077bcce844cb42a197dcd6100549b7d3 dc1fd38e325f4a2caa75aeab79da75d3 - - default default] Lock "ed00d159-9d70-481e-93be-ea180fea04ba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.487 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[09835604-c3b0-4f3e-8dc0-e61b21f4ca62]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227936, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227936, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.489 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.490 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:10.495 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.501 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.502 187212 DEBUG nova.virt.disk.api [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.502 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.541 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.572 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.573 187212 DEBUG nova.virt.disk.api [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.573 187212 DEBUG nova.objects.instance [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.591 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.592 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Ensure instance console log exists: /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.594 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.594 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.597 187212 INFO nova.virt.libvirt.driver [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Instance destroyed successfully.#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.598 187212 DEBUG nova.objects.instance [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid b81bb939-d14f-4a72-b7fe-95fc5d8810a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.611 187212 DEBUG nova.virt.libvirt.vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1462907521',display_name='tempest-ListServerFiltersTestJSON-instance-1462907521',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1462907521',id=58,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-bzpoia2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:01Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=b81bb939-d14f-4a72-b7fe-95fc5d8810a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.612 187212 DEBUG nova.network.os_vif_util [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "address": "fa:16:3e:d3:3c:38", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5683f8a8-69", "ovs_interfaceid": "5683f8a8-691c-43f3-a88f-eb0c30ccb3c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.613 187212 DEBUG nova.network.os_vif_util [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.613 187212 DEBUG os_vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.619 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5683f8a8-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.621 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.629 187212 INFO os_vif [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:3c:38,bridge_name='br-int',has_traffic_filtering=True,id=5683f8a8-691c-43f3-a88f-eb0c30ccb3c5,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5683f8a8-69')#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.629 187212 INFO nova.virt.libvirt.driver [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deleting instance files /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1_del#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.630 187212 INFO nova.virt.libvirt.driver [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deletion of /var/lib/nova/instances/b81bb939-d14f-4a72-b7fe-95fc5d8810a1_del complete#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.688 187212 INFO nova.compute.manager [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.688 187212 DEBUG oslo.service.loopingcall [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.689 187212 DEBUG nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:10 np0005546909 nova_compute[187208]: 2025-12-05 12:07:10.689 187212 DEBUG nova.network.neutron [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:11 np0005546909 nova_compute[187208]: 2025-12-05 12:07:11.814 187212 DEBUG nova.network.neutron [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:11 np0005546909 nova_compute[187208]: 2025-12-05 12:07:11.842 187212 INFO nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Took 1.15 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:11 np0005546909 nova_compute[187208]: 2025-12-05 12:07:11.900 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:11 np0005546909 nova_compute[187208]: 2025-12-05 12:07:11.901 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:11 np0005546909 nova_compute[187208]: 2025-12-05 12:07:11.945 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Successfully created port: 1b4ab157-ddea-449c-ab91-983a53dd2045 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.044 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updated VIF entry in instance network info cache for port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.045 187212 DEBUG nova.network.neutron [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [{"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.061 187212 DEBUG oslo_concurrency.lockutils [req-077fbd6a-5c4f-4bbf-ac6d-87bc60c2bfce req-604b4b37-6030-414c-87e3-f692a64f7284 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-bcdca3f9-3e24-4209-808c-8093b55e5c2d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.129 187212 DEBUG nova.compute.provider_tree [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.148 187212 DEBUG nova.scheduler.client.report [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.171 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.193 187212 INFO nova.scheduler.client.report [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance b81bb939-d14f-4a72-b7fe-95fc5d8810a1#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.265 187212 DEBUG oslo_concurrency.lockutils [None req-dcc97212-f6ae-4196-88cc-a5af1ba9b2f7 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:12 np0005546909 nova_compute[187208]: 2025-12-05 12:07:12.753 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.056 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.057 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.057 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.058 187212 INFO nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Terminating instance#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.059 187212 DEBUG nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:13 np0005546909 kernel: tap88c7b630-e8 (unregistering): left promiscuous mode
Dec  5 07:07:13 np0005546909 NetworkManager[55691]: <info>  [1764936433.0969] device (tap88c7b630-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:13Z|00536|binding|INFO|Releasing lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 from this chassis (sb_readonly=0)
Dec  5 07:07:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:13Z|00537|binding|INFO|Setting lport 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 down in Southbound
Dec  5 07:07:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:13Z|00538|binding|INFO|Removing iface tap88c7b630-e8 ovn-installed in OVS
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Dec  5 07:07:13 np0005546909 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d0000003d.scope: Consumed 14.458s CPU time.
Dec  5 07:07:13 np0005546909 systemd-machined[153543]: Machine qemu-66-instance-0000003d terminated.
Dec  5 07:07:13 np0005546909 NetworkManager[55691]: <info>  [1764936433.2772] manager: (tap88c7b630-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.294 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:19:b7 10.100.0.7'], port_security=['fa:16:3e:bb:19:b7 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bcdca3f9-3e24-4209-808c-8093b55e5c2d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0566af06-3837-49db-a95c-47b9857e4e90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5285f99befb24ac285be8e4fc1d18e69', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a5c5fedc-8874-4d17-85d6-f832393ee546', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b689627-4043-49f3-b45a-0160a35a0a18, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88c7b630-e84b-4a35-8c8f-f934e7cabaf6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.296 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88c7b630-e84b-4a35-8c8f-f934e7cabaf6 in datapath 0566af06-3837-49db-a95c-47b9857e4e90 unbound from our chassis#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.297 187212 DEBUG nova.compute.manager [req-192e953a-e699-4f96-8dcf-41dfc6b9c93e req-527f23e1-a804-4feb-97c0-9e590bc0c0f3 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Received event network-vif-deleted-d10caa85-dfcd-49ce-8ff7-2c2a68d1d731 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.299 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0566af06-3837-49db-a95c-47b9857e4e90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.300 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c820a59b-ea80-4f11-9684-60337430cf21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.300 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 namespace which is not needed anymore#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.325 187212 INFO nova.virt.libvirt.driver [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Instance destroyed successfully.#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.326 187212 DEBUG nova.objects.instance [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lazy-loading 'resources' on Instance uuid bcdca3f9-3e24-4209-808c-8093b55e5c2d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:13 np0005546909 podman[227974]: 2025-12-05 12:07:13.378615302 +0000 UTC m=+0.051149285 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.465 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.466 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 WARNING nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-unplugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.467 187212 DEBUG oslo_concurrency.lockutils [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b81bb939-d14f-4a72-b7fe-95fc5d8810a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] No waiting events found dispatching network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 WARNING nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received unexpected event network-vif-plugged-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.468 187212 DEBUG nova.compute.manager [req-fbb5b35e-bb7a-4a34-a851-206665d1759a req-73177577-d4a6-4867-b569-6bf87d4af2b5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Received event network-vif-deleted-5683f8a8-691c-43f3-a88f-eb0c30ccb3c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.473 187212 DEBUG nova.virt.libvirt.vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-2105634627',display_name='tempest-AttachInterfacesUnderV243Test-server-2105634627',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-2105634627',id=61,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNE7kQOo1iw7msO5U3UKQiYNUNOuR3489N27cA8/7AyK9hUMINDB4EKPtuAqKWiOpLa6/9d1/JcrFvBfelk3gje2Ue6XSif/X6uD8HtKgekiyZF9ENjW4HKYytyiU96vgQ==',key_name='tempest-keypair-865071651',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5285f99befb24ac285be8e4fc1d18e69',ramdisk_id='',reservation_id='r-93zclce8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-1358924829',owner_user_name='tempest-AttachInterfacesUnderV243Test-1358924829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:10Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6b73160d333a43ed94d4258262e3c2b5',uuid=bcdca3f9-3e24-4209-808c-8093b55e5c2d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.473 187212 DEBUG nova.network.os_vif_util [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converting VIF {"id": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "address": "fa:16:3e:bb:19:b7", "network": {"id": "0566af06-3837-49db-a95c-47b9857e4e90", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-2120321794-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5285f99befb24ac285be8e4fc1d18e69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88c7b630-e8", "ovs_interfaceid": "88c7b630-e84b-4a35-8c8f-f934e7cabaf6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.474 187212 DEBUG nova.network.os_vif_util [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.474 187212 DEBUG os_vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.475 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c7b630-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.517 187212 INFO os_vif [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bb:19:b7,bridge_name='br-int',has_traffic_filtering=True,id=88c7b630-e84b-4a35-8c8f-f934e7cabaf6,network=Network(0566af06-3837-49db-a95c-47b9857e4e90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88c7b630-e8')#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.517 187212 INFO nova.virt.libvirt.driver [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deleting instance files /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d_del#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.518 187212 INFO nova.virt.libvirt.driver [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deletion of /var/lib/nova/instances/bcdca3f9-3e24-4209-808c-8093b55e5c2d_del complete#033[00m
Dec  5 07:07:13 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:13 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [NOTICE]   (226578) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:13 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [WARNING]  (226578) : Exiting Master process...
Dec  5 07:07:13 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [ALERT]    (226578) : Current worker (226580) exited with code 143 (Terminated)
Dec  5 07:07:13 np0005546909 neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90[226574]: [WARNING]  (226578) : All workers exited. Exiting... (0)
Dec  5 07:07:13 np0005546909 systemd[1]: libpod-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope: Deactivated successfully.
Dec  5 07:07:13 np0005546909 podman[228010]: 2025-12-05 12:07:13.550706956 +0000 UTC m=+0.168809190 container died abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:07:13 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:13 np0005546909 systemd[1]: var-lib-containers-storage-overlay-0b07d63045405ef887718f7278761beb53f95ff32a7c0a77fd47c0591ae50b1a-merged.mount: Deactivated successfully.
Dec  5 07:07:13 np0005546909 podman[228010]: 2025-12-05 12:07:13.590332056 +0000 UTC m=+0.208434260 container cleanup abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:07:13 np0005546909 systemd[1]: libpod-conmon-abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4.scope: Deactivated successfully.
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 INFO nova.compute.manager [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 DEBUG oslo.service.loopingcall [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.651 187212 DEBUG nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.652 187212 DEBUG nova.network.neutron [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:13 np0005546909 podman[228045]: 2025-12-05 12:07:13.689970747 +0000 UTC m=+0.073609787 container remove abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.694 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0fad9c7e-62c2-40ed-b364-cdec4c04b767]: (4, ('Fri Dec  5 12:07:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 (abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4)\nabb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4\nFri Dec  5 12:07:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 (abb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4)\nabb18fa19af6b776412a8a82157e2f4b94b9a03855787999ba1fafa3907735f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.696 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f39ada3-eba3-4f8d-8523-03c8675d16ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.697 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0566af06-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:13 np0005546909 kernel: tap0566af06-30: left promiscuous mode
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 nova_compute[187208]: 2025-12-05 12:07:13.716 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.720 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[11342a60-1c2c-4df6-ae9d-f90cc1e186c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[11a1e2e9-ed47-4618-92d6-1083b9bdb7f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.742 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c5955714-e525-450b-8a53-a919179351fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b34286c-4f3f-444d-b63b-f793b2e0d080]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375995, 'reachable_time': 24710, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228059, 'error': None, 'target': 'ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.759 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0566af06-3837-49db-a95c-47b9857e4e90 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:13.760 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bf975d-8e8b-43dd-b47f-3b8d8e59d527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:13 np0005546909 systemd[1]: run-netns-ovnmeta\x2d0566af06\x2d3837\x2d49db\x2da95c\x2d47b9857e4e90.mount: Deactivated successfully.
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.355 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936419.3544672, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.355 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.371 187212 DEBUG nova.compute.manager [None req-fe8ac904-2f43-41ff-bac4-9cb943b64825 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.578 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Successfully updated port: 1b4ab157-ddea-449c-ab91-983a53dd2045 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.593 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.594 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.732 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.741 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.742 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.742 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.743 187212 INFO nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Terminating instance#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.744 187212 DEBUG nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:14 np0005546909 kernel: tapc5cb68aa-e5 (unregistering): left promiscuous mode
Dec  5 07:07:14 np0005546909 NetworkManager[55691]: <info>  [1764936434.7777] device (tapc5cb68aa-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:14Z|00539|binding|INFO|Releasing lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 from this chassis (sb_readonly=0)
Dec  5 07:07:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:14Z|00540|binding|INFO|Setting lport c5cb68aa-e5c2-48b0-b9c4-e0542120e065 down in Southbound
Dec  5 07:07:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:14Z|00541|binding|INFO|Removing iface tapc5cb68aa-e5 ovn-installed in OVS
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.786 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.793 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:a8:16 10.100.0.13'], port_security=['fa:16:3e:8a:a8:16 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8888dd78-1c78-4065-8536-9a1096bdf57b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c5cb68aa-e5c2-48b0-b9c4-e0542120e065) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.794 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c5cb68aa-e5c2-48b0-b9c4-e0542120e065 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.795 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.811 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b72444c2-a1bf-4edf-91e7-4f53d1227ea2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000039.scope: Deactivated successfully.
Dec  5 07:07:14 np0005546909 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000039.scope: Consumed 16.217s CPU time.
Dec  5 07:07:14 np0005546909 systemd-machined[153543]: Machine qemu-61-instance-00000039 terminated.
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.846 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[24d8407d-e380-404c-a5be-fa4aa5f20899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.850 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[96e1416e-e243-439a-ad8d-fdc214586e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.877 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[da3f4408-35e5-478b-bcc8-326390c30112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.894 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35ff0772-998b-4f36-8494-c050c123063d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a2d11fe-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:94:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 19, 'rx_bytes': 868, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 128], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372590, 'reachable_time': 40700, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228073, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.910 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7824da0b-7539-4102-a607-ffa0aca9cba8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372603, 'tstamp': 372603}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228074, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4a2d11fe-a1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372605, 'tstamp': 372605}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228074, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.912 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:14 np0005546909 nova_compute[187208]: 2025-12-05 12:07:14.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a2d11fe-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.918 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a2d11fe-a0, col_values=(('external_ids', {'iface-id': '27f6a3c0-dd69-4255-8d00-850605f3016e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:14.919 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.011 187212 INFO nova.virt.libvirt.driver [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Instance destroyed successfully.#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.011 187212 DEBUG nova.objects.instance [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid 8888dd78-1c78-4065-8536-9a1096bdf57b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.029 187212 DEBUG nova.virt.libvirt.vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-2001854085',display_name='tempest-ListServerFiltersTestJSON-instance-2001854085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-2001854085',id=57,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-ubyu8olf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:05:51Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=8888dd78-1c78-4065-8536-9a1096bdf57b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.029 187212 DEBUG nova.network.os_vif_util [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "address": "fa:16:3e:8a:a8:16", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5cb68aa-e5", "ovs_interfaceid": "c5cb68aa-e5c2-48b0-b9c4-e0542120e065", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.030 187212 DEBUG nova.network.os_vif_util [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.030 187212 DEBUG os_vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.032 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5cb68aa-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.036 187212 INFO os_vif [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:a8:16,bridge_name='br-int',has_traffic_filtering=True,id=c5cb68aa-e5c2-48b0-b9c4-e0542120e065,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5cb68aa-e5')#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.037 187212 INFO nova.virt.libvirt.driver [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deleting instance files /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b_del#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.038 187212 INFO nova.virt.libvirt.driver [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deletion of /var/lib/nova/instances/8888dd78-1c78-4065-8536-9a1096bdf57b_del complete#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.087 187212 INFO nova.compute.manager [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG oslo.service.loopingcall [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.088 187212 DEBUG nova.network.neutron [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.936 187212 DEBUG nova.compute.manager [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-changed-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.936 187212 DEBUG nova.compute.manager [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Refreshing instance network info cache due to event network-changed-1b4ab157-ddea-449c-ab91-983a53dd2045. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:15 np0005546909 nova_compute[187208]: 2025-12-05 12:07:15.937 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.031 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.032 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.032 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-unplugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.033 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.034 187212 DEBUG oslo_concurrency.lockutils [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.035 187212 DEBUG nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] No waiting events found dispatching network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.035 187212 WARNING nova.compute.manager [req-607934ca-e6a9-43fd-bcc4-34fdaca01e32 req-9a8b21b9-9b22-4e96-bacb-eb52949d741e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received unexpected event network-vif-plugged-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.077 187212 DEBUG nova.network.neutron [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.101 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.101 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance network_info: |[{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.102 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.102 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Refreshing network info cache for port 1b4ab157-ddea-449c-ab91-983a53dd2045 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.105 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start _get_guest_xml network_info=[{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.109 187212 WARNING nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.115 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.115 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.122 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.123 187212 DEBUG nova.virt.libvirt.host [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.123 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.124 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.124 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.125 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.126 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.127 187212 DEBUG nova.virt.hardware [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.131 187212 DEBUG nova.virt.libvirt.vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTest
OtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:10Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.131 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.132 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.133 187212 DEBUG nova.objects.instance [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.146 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <uuid>854e3893-3908-4b4a-b29c-7fb4384e4f0c</uuid>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <name>instance-00000041</name>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestOtherB-server-63085993</nova:name>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:16</nova:creationTime>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        <nova:port uuid="1b4ab157-ddea-449c-ab91-983a53dd2045">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="serial">854e3893-3908-4b4a-b29c-7fb4384e4f0c</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="uuid">854e3893-3908-4b4a-b29c-7fb4384e4f0c</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:03:e5:0a"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <target dev="tap1b4ab157-dd"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/console.log" append="off"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:16 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:16 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:16 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:16 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Preparing to wait for external event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.147 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG nova.virt.libvirt.vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerA
ctionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:10Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.148 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.149 187212 DEBUG nova.network.os_vif_util [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.149 187212 DEBUG os_vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.150 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.152 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.152 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b4ab157-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.153 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b4ab157-dd, col_values=(('external_ids', {'iface-id': '1b4ab157-ddea-449c-ab91-983a53dd2045', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:e5:0a', 'vm-uuid': '854e3893-3908-4b4a-b29c-7fb4384e4f0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.154 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:16 np0005546909 NetworkManager[55691]: <info>  [1764936436.1551] manager: (tap1b4ab157-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.167 187212 INFO os_vif [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd')#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.219 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:03:e5:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.220 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Using config drive#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.413 187212 DEBUG nova.network.neutron [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.437 187212 INFO nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Took 1.35 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.511 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.511 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.643 187212 DEBUG nova.compute.provider_tree [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.663 187212 DEBUG nova.scheduler.client.report [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.671 187212 DEBUG nova.network.neutron [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.695 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.697 187212 INFO nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Took 3.05 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.732 187212 INFO nova.scheduler.client.report [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance 8888dd78-1c78-4065-8536-9a1096bdf57b#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.758 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.759 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.797 187212 INFO nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Creating config drive at /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.802 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokrksgkz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.846 187212 DEBUG oslo_concurrency.lockutils [None req-2f6c8d8c-2bf0-4d4a-bc55-4532460a9bf3 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.923 187212 DEBUG nova.compute.provider_tree [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.931 187212 DEBUG oslo_concurrency.processutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpokrksgkz" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.938 187212 DEBUG nova.scheduler.client.report [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.960 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:16 np0005546909 nova_compute[187208]: 2025-12-05 12:07:16.984 187212 INFO nova.scheduler.client.report [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Deleted allocations for instance bcdca3f9-3e24-4209-808c-8093b55e5c2d#033[00m
Dec  5 07:07:17 np0005546909 kernel: tap1b4ab157-dd: entered promiscuous mode
Dec  5 07:07:17 np0005546909 NetworkManager[55691]: <info>  [1764936437.0053] manager: (tap1b4ab157-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.005 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00542|binding|INFO|Claiming lport 1b4ab157-ddea-449c-ab91-983a53dd2045 for this chassis.
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00543|binding|INFO|1b4ab157-ddea-449c-ab91-983a53dd2045: Claiming fa:16:3e:03:e5:0a 10.100.0.13
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.013 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:e5:0a 10.100.0.13'], port_security=['fa:16:3e:03:e5:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=1b4ab157-ddea-449c-ab91-983a53dd2045) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.015 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 1b4ab157-ddea-449c-ab91-983a53dd2045 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.016 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00544|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 ovn-installed in OVS
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00545|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 up in Southbound
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 systemd-udevd[228113]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.034 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[09c881e8-c8d7-440c-8778-880c5b951109]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 NetworkManager[55691]: <info>  [1764936437.0442] device (tap1b4ab157-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:17 np0005546909 NetworkManager[55691]: <info>  [1764936437.0452] device (tap1b4ab157-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:17 np0005546909 systemd-machined[153543]: New machine qemu-70-instance-00000041.
Dec  5 07:07:17 np0005546909 systemd[1]: Started Virtual Machine qemu-70-instance-00000041.
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.060 187212 DEBUG oslo_concurrency.lockutils [None req-85693629-9355-4735-8968-59777bb424f8 6b73160d333a43ed94d4258262e3c2b5 5285f99befb24ac285be8e4fc1d18e69 - - default default] Lock "bcdca3f9-3e24-4209-808c-8093b55e5c2d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9840555b-033c-405e-b9e7-536bd77589f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.069 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c75d8726-531e-4739-a33a-ce2c8a62547c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.097 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b43d4c15-ea72-47c2-9aaa-ea9bdfee420d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.114 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e57ec1bf-6b58-40b6-9bd0-87eeee58a93b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228126, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.129 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a611388-c324-46cf-ad3b-fc8bac68e3a2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228127, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228127, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.135 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.136 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.637 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.638 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.639 187212 INFO nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Terminating instance#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.640 187212 DEBUG nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:17 np0005546909 kernel: tap549318e9-e6 (unregistering): left promiscuous mode
Dec  5 07:07:17 np0005546909 NetworkManager[55691]: <info>  [1764936437.6703] device (tap549318e9-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.674 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00546|binding|INFO|Releasing lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 from this chassis (sb_readonly=0)
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00547|binding|INFO|Setting lport 549318e9-e629-4e2c-8cbb-3cd263c2bc34 down in Southbound
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00548|binding|INFO|Removing iface tap549318e9-e6 ovn-installed in OVS
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.684 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:d7:ed 10.100.0.9'], port_security=['fa:16:3e:9b:d7:ed 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'cbcd4733-8c53-4696-9bc0-6e5c516c9dcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8f613c8797e432d96e43223fb7c476d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '042f2e38-43a6-405e-ac82-b7fb12410d0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87579b50-ed4b-4ff4-b9d3-80f6bd4fa597, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=549318e9-e629-4e2c-8cbb-3cd263c2bc34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.685 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 549318e9-e629-4e2c-8cbb-3cd263c2bc34 in datapath 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 unbound from our chassis#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.688 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dd928541-1a47-4d49-a25c-ff925e9a986d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.689 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 namespace which is not needed anymore#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.710 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000038.scope: Deactivated successfully.
Dec  5 07:07:17 np0005546909 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000038.scope: Consumed 13.561s CPU time.
Dec  5 07:07:17 np0005546909 systemd-machined[153543]: Machine qemu-69-instance-00000038 terminated.
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:17 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [NOTICE]   (225723) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:17 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [WARNING]  (225723) : Exiting Master process...
Dec  5 07:07:17 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [ALERT]    (225723) : Current worker (225725) exited with code 143 (Terminated)
Dec  5 07:07:17 np0005546909 neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63[225719]: [WARNING]  (225723) : All workers exited. Exiting... (0)
Dec  5 07:07:17 np0005546909 systemd[1]: libpod-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope: Deactivated successfully.
Dec  5 07:07:17 np0005546909 conmon[225719]: conmon 912e0eba89b1a71753b4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope/container/memory.events
Dec  5 07:07:17 np0005546909 podman[228150]: 2025-12-05 12:07:17.837154312 +0000 UTC m=+0.051440884 container died 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.862 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.868 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay-a0675ec35d20a48a0477b2dc90980940bf1de49a39a39196259a264090cabf69-merged.mount: Deactivated successfully.
Dec  5 07:07:17 np0005546909 podman[228150]: 2025-12-05 12:07:17.879693016 +0000 UTC m=+0.093979568 container cleanup 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 07:07:17 np0005546909 systemd[1]: libpod-conmon-912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b.scope: Deactivated successfully.
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.913 187212 INFO nova.virt.libvirt.driver [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Instance destroyed successfully.#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.914 187212 DEBUG nova.objects.instance [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lazy-loading 'resources' on Instance uuid cbcd4733-8c53-4696-9bc0-6e5c516c9dcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.929 187212 DEBUG nova.virt.libvirt.vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1365452817',display_name='tempest-ListServerFiltersTestJSON-instance-1365452817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1365452817',id=56,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8f613c8797e432d96e43223fb7c476d',ramdisk_id='',reservation_id='r-6r1u1q6j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-711798252',owner_user_name='tempest-ListServerFiltersTestJSON-711798252-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:41Z,user_data=None,user_id='4f8149b8192e411a9131b103b25862b6',uuid=cbcd4733-8c53-4696-9bc0-6e5c516c9dcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.930 187212 DEBUG nova.network.os_vif_util [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converting VIF {"id": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "address": "fa:16:3e:9b:d7:ed", "network": {"id": "4a2d11fe-a91d-4cf5-bde7-283f0aa52f63", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-610444395-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8f613c8797e432d96e43223fb7c476d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap549318e9-e6", "ovs_interfaceid": "549318e9-e629-4e2c-8cbb-3cd263c2bc34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.930 187212 DEBUG nova.network.os_vif_util [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.931 187212 DEBUG os_vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.935 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap549318e9-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.938 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.941 187212 INFO os_vif [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:d7:ed,bridge_name='br-int',has_traffic_filtering=True,id=549318e9-e629-4e2c-8cbb-3cd263c2bc34,network=Network(4a2d11fe-a91d-4cf5-bde7-283f0aa52f63),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap549318e9-e6')#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.941 187212 INFO nova.virt.libvirt.driver [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deleting instance files /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf_del#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.942 187212 INFO nova.virt.libvirt.driver [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deletion of /var/lib/nova/instances/cbcd4733-8c53-4696-9bc0-6e5c516c9dcf_del complete#033[00m
Dec  5 07:07:17 np0005546909 podman[228189]: 2025-12-05 12:07:17.952963272 +0000 UTC m=+0.048832948 container remove 912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.959 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2e03c293-d85d-4840-aab1-563f7af25c3b]: (4, ('Fri Dec  5 12:07:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 (912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b)\n912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b\nFri Dec  5 12:07:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 (912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b)\n912e0eba89b1a71753b4cabb98e441a5097fdc23977b08dfebfa9a4ab12e779b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.961 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d25777-92c4-4734-8a98-996c50eac59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.962 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a2d11fe-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.963 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 kernel: tap4a2d11fe-a0: left promiscuous mode
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00549|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:07:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:17Z|00550|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.968 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca65f1fc-d6aa-4302-9fcf-2f029e3e26db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.986 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54ec0e18-d977-4306-b295-9e08eb1b0d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.987 187212 INFO nova.compute.manager [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.988 187212 DEBUG oslo.service.loopingcall [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:17.988 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f0828aca-894c-4ecd-a81c-70838f38470e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.988 187212 DEBUG nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.989 187212 DEBUG nova.network.neutron [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:17 np0005546909 nova_compute[187208]: 2025-12-05 12:07:17.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8613182e-29da-44c3-9f8a-f0fba6302573]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372582, 'reachable_time': 34004, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228206, 'error': None, 'target': 'ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:18 np0005546909 systemd[1]: run-netns-ovnmeta\x2d4a2d11fe\x2da91d\x2d4cf5\x2dbde7\x2d283f0aa52f63.mount: Deactivated successfully.
Dec  5 07:07:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.016 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a2d11fe-a91d-4cf5-bde7-283f0aa52f63 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:18.016 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[894892ca-2ccb-403a-bd63-77736b91bc8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.105 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.106 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.106 187212 INFO nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Unshelving#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.188 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.189 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.194 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.209 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.222 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.223 187212 INFO nova.compute.claims [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.232 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936438.2324052, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.233 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.257 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.262 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936438.233243, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.281 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updated VIF entry in instance network info cache for port 1b4ab157-ddea-449c-ab91-983a53dd2045. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.282 187212 DEBUG nova.network.neutron [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [{"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.287 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.290 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.304 187212 DEBUG oslo_concurrency.lockutils [req-7136b3eb-4e68-4475-ad92-dda0fa3aacce req-e2d6fb89-1fcc-4d7d-81f5-808fbbd55e48 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-854e3893-3908-4b4a-b29c-7fb4384e4f0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.311 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.426 187212 DEBUG nova.compute.provider_tree [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.438 187212 DEBUG nova.scheduler.client.report [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:18 np0005546909 nova_compute[187208]: 2025-12-05 12:07:18.456 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.036 187212 INFO nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.270 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936424.2694447, 472c7e2c-bdad-4230-904b-6937ceb872d2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.270 187212 INFO nova.compute.manager [-] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.290 187212 DEBUG nova.compute.manager [None req-5290adf5-332a-4bd2-8732-23221d7e73ff - - - - - -] [instance: 472c7e2c-bdad-4230-904b-6937ceb872d2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.524 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-deleted-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.525 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.525 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.526 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.527 187212 DEBUG oslo_concurrency.lockutils [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.527 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.528 187212 DEBUG nova.compute.manager [req-a73257b0-a141-4700-a127-ca677743f9b4 req-fb37d9dd-c2a3-4ae5-ad3a-9f38a13642fb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-unplugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.621 187212 DEBUG nova.network.neutron [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.645 187212 INFO nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Took 1.66 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.675 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.675 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.676 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.676 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 WARNING nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-unplugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.677 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "8888dd78-1c78-4065-8536-9a1096bdf57b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] No waiting events found dispatching network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.678 187212 WARNING nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Received unexpected event network-vif-plugged-c5cb68aa-e5c2-48b0-b9c4-e0542120e065 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Received event network-vif-deleted-88c7b630-e84b-4a35-8c8f-f934e7cabaf6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.679 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.680 187212 DEBUG oslo_concurrency.lockutils [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.680 187212 DEBUG nova.compute.manager [req-3c87ce76-5ea2-419e-af98-16a8b9e69d6d req-5ca2f00e-57f7-4e1e-9a5e-9ac49e610c7e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Processing event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.681 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.685 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936439.684806, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.685 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.687 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.883 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.883 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.884 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.887 187212 INFO nova.virt.libvirt.driver [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance spawned successfully.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.888 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.900 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.910 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.910 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.911 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.911 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.912 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.912 187212 DEBUG nova.virt.libvirt.driver [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.924 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.990 187212 INFO nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 9.81 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:07:19 np0005546909 nova_compute[187208]: 2025-12-05 12:07:19.991 187212 DEBUG nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.029 187212 DEBUG nova.compute.provider_tree [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.039 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936425.0384374, 25918fc4-05ec-4a16-b77f-ca1d352a2763 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.040 187212 INFO nova.compute.manager [-] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.046 187212 DEBUG nova.scheduler.client.report [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.055 187212 INFO nova.compute.manager [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 10.44 seconds to build instance.#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.062 187212 DEBUG nova.compute.manager [None req-efafde38-a9ce-4ffe-a95f-1dd94f410d42 - - - - - -] [instance: 25918fc4-05ec-4a16-b77f-ca1d352a2763] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.069 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.073 187212 DEBUG oslo_concurrency.lockutils [None req-5c1b0c57-1233-4592-9149-47a2134f7dbd 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.089 187212 INFO nova.scheduler.client.report [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Deleted allocations for instance cbcd4733-8c53-4696-9bc0-6e5c516c9dcf#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.152 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.152 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.155 187212 DEBUG oslo_concurrency.lockutils [None req-628e628a-a09e-464f-96e7-b3b7813596dd 4f8149b8192e411a9131b103b25862b6 e8f613c8797e432d96e43223fb7c476d - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.176 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.244 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.245 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.250 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.250 187212 INFO nova.compute.claims [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.402 187212 DEBUG nova.compute.provider_tree [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.416 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.417 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.417 187212 DEBUG nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.419 187212 DEBUG nova.scheduler.client.report [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.484 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.484 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.528 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.529 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.557 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.572 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.657 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.659 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.659 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating image(s)#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.660 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.660 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.661 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.679 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.752 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.753 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.754 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.764 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.825 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.826 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.866 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.867 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.868 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.924 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.925 187212 DEBUG nova.virt.disk.api [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.926 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.986 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.987 187212 DEBUG nova.virt.disk.api [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:07:20 np0005546909 nova_compute[187208]: 2025-12-05 12:07:20.988 187212 DEBUG nova.objects.instance [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.001 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.002 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Ensure instance console log exists: /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.002 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.003 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.003 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.064 187212 DEBUG nova.policy [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '442a804e3368417d9de1636d533a25e0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:07:21 np0005546909 podman[228231]: 2025-12-05 12:07:21.215408813 +0000 UTC m=+0.068431587 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  5 07:07:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:21Z|00551|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:07:21 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:21Z|00552|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:21 np0005546909 nova_compute[187208]: 2025-12-05 12:07:21.901 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Successfully created port: d596fdf6-011f-43a4-bdb8-e76cc7302187 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.423 187212 INFO nova.compute.manager [None req-6d43865c-2a73-43b9-ac56-288219ab4719 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Get console output#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.515 213424 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.571 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936427.5699482, ed00d159-9d70-481e-93be-ea180fea04ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.572 187212 INFO nova.compute.manager [-] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.595 187212 DEBUG nova.compute.manager [None req-84cb1d41-2932-4279-8526-d136a835395e - - - - - -] [instance: ed00d159-9d70-481e-93be-ea180fea04ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.623 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.624 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.624 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "cbcd4733-8c53-4696-9bc0-6e5c516c9dcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] No waiting events found dispatching network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 WARNING nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received unexpected event network-vif-plugged-549318e9-e629-4e2c-8cbb-3cd263c2bc34 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.625 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Received event network-vif-deleted-549318e9-e629-4e2c-8cbb-3cd263c2bc34 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG nova.compute.manager [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing instance network info cache due to event network-changed-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.626 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.647 187212 DEBUG nova.network.neutron [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.669 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.670 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.671 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating image(s)#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.672 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.673 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.674 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.674 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Refreshing network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.696 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.697 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.705 187212 DEBUG nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG oslo_concurrency.lockutils [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.706 187212 DEBUG nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.707 187212 WARNING nova.compute.manager [req-9ed37d06-ebfa-4332-89b7-7b75ba0d6ca3 req-76f24274-d107-46c7-997c-5379ea56ba58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received unexpected event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:22 np0005546909 nova_compute[187208]: 2025-12-05 12:07:22.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.003 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Successfully updated port: d596fdf6-011f-43a4-bdb8-e76cc7302187 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.012 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.013 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.033 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.033 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.034 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.043 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.116 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.117 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.126 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.126 187212 INFO nova.compute.claims [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.370 187212 DEBUG nova.compute.provider_tree [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.384 187212 DEBUG nova.scheduler.client.report [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.404 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.405 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.475 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.476 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.494 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.512 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.529 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.617 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.619 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.620 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating image(s)#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.620 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.621 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.622 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.637 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.705 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.706 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.708 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.719 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.795 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.797 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.829 187212 DEBUG nova.policy [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.839 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.840 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.841 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.942 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.943 187212 DEBUG nova.virt.disk.api [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:07:23 np0005546909 nova_compute[187208]: 2025-12-05 12:07:23.944 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.004 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.005 187212 DEBUG nova.virt.disk.api [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.006 187212 DEBUG nova.objects.instance [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.034 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.035 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Ensure instance console log exists: /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.035 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.036 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:24 np0005546909 nova_compute[187208]: 2025-12-05 12:07:24.036 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.036 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.098 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.099 187212 DEBUG nova.virt.images [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] 13b862b8-8b0a-448a-bbba-7d8ef455d2c6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.100 187212 DEBUG nova.privsep.utils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.101 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.518 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.part /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.536 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.565 187212 DEBUG nova.compute.manager [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.566 187212 DEBUG nova.compute.manager [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.567 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.590 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936430.5894876, b81bb939-d14f-4a72-b7fe-95fc5d8810a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.590 187212 INFO nova.compute.manager [-] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.606 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89.converted --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.607 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.623 187212 DEBUG nova.compute.manager [None req-8820c450-8dd3-4246-9da0-dea368daa4ba - - - - - -] [instance: b81bb939-d14f-4a72-b7fe-95fc5d8810a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.624 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.688 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.690 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.691 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.704 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.769 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.773 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.774 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.809 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89,backing_fmt=raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.810 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.810 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.868 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.870 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.887 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Rebasing disk image.#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.887 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.909 187212 DEBUG nova.network.neutron [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.930 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.931 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance network_info: |[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.931 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.932 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.936 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start _get_guest_xml network_info=[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.941 187212 WARNING nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.945 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.946 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.972 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.973 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.978 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.979 187212 DEBUG nova.virt.libvirt.host [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.980 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.980 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.981 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.982 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.983 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.984 187212 DEBUG nova.virt.hardware [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.989 187212 DEBUG nova.virt.libvirt.vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTe
stJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.989 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.990 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:25 np0005546909 nova_compute[187208]: 2025-12-05 12:07:25.991 187212 DEBUG nova.objects.instance [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.009 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <uuid>39a36503-acd4-4199-89f3-2e714ef9e5c5</uuid>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <name>instance-00000042</name>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:name>tempest-SecurityGroupsTestJSON-server-1919324581</nova:name>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:25</nova:creationTime>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        <nova:port uuid="d596fdf6-011f-43a4-bdb8-e76cc7302187">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="serial">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="uuid">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:20:58:3d"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <target dev="tapd596fdf6-01"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log" append="off"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:26 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:26 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:26 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:26 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.010 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Preparing to wait for external event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.012 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.virt.libvirt.vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.013 187212 DEBUG nova.network.os_vif_util [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.014 187212 DEBUG os_vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.015 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.015 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.018 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd596fdf6-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.018 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd596fdf6-01, col_values=(('external_ids', {'iface-id': 'd596fdf6-011f-43a4-bdb8-e76cc7302187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:58:3d', 'vm-uuid': '39a36503-acd4-4199-89f3-2e714ef9e5c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:26 np0005546909 NetworkManager[55691]: <info>  [1764936446.0209] manager: (tapd596fdf6-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.021 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.027 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.028 187212 INFO os_vif [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.097 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.097 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.098 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] No VIF found with MAC fa:16:3e:20:58:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.098 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Using config drive#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.138 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated VIF entry in instance network info cache for port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.138 187212 DEBUG nova.network.neutron [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.154 187212 DEBUG oslo_concurrency.lockutils [req-07499796-a103-4295-950d-00075348d0d0 req-7d509bb2-eb4a-4538-b7dc-c242b041d87d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.808 187212 INFO nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Creating config drive at /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.813 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcvvu7gy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:26 np0005546909 nova_compute[187208]: 2025-12-05 12:07:26.942 187212 DEBUG oslo_concurrency.processutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvcvvu7gy" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:27 np0005546909 kernel: tapd596fdf6-01: entered promiscuous mode
Dec  5 07:07:27 np0005546909 NetworkManager[55691]: <info>  [1764936447.0054] manager: (tapd596fdf6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Dec  5 07:07:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:27Z|00553|binding|INFO|Claiming lport d596fdf6-011f-43a4-bdb8-e76cc7302187 for this chassis.
Dec  5 07:07:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:27Z|00554|binding|INFO|d596fdf6-011f-43a4-bdb8-e76cc7302187: Claiming fa:16:3e:20:58:3d 10.100.0.11
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.057 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.063 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.064 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.067 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4#033[00m
Dec  5 07:07:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:27Z|00555|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 ovn-installed in OVS
Dec  5 07:07:27 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:27Z|00556|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 up in Southbound
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.075 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.085 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d7179543-d28b-4352-aecc-562524b9def1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 systemd-udevd[228323]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:27 np0005546909 NetworkManager[55691]: <info>  [1764936447.1037] device (tapd596fdf6-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:27 np0005546909 NetworkManager[55691]: <info>  [1764936447.1050] device (tapd596fdf6-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:27 np0005546909 systemd-machined[153543]: New machine qemu-71-instance-00000042.
Dec  5 07:07:27 np0005546909 systemd[1]: Started Virtual Machine qemu-71-instance-00000042.
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.130 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2546f613-dc25-4bb4-8d7e-4e00cbb73866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.135 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f08c6432-5fda-4f6b-b532-86cdf52b0c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ad6a4c-fdd6-4ede-99d4-a81fb64ea1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.191 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9b190fec-aef9-4663-9736-994538c5a815]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228336, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[18302545-0aa0-4f1a-99fb-e9f9eee07412]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228338, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228338, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.211 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.213 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.214 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.215 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.215 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:27.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.548 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk" returned: 0 in 1.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.549 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.549 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Ensure instance console log exists: /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.550 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.553 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start _get_guest_xml network_info=[{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='28a29f9d03a8dff023fa9db5bc7f166e',container_format='bare',created_at=2025-12-05T12:06:56Z,direct_url=<?>,disk_format='qcow2',id=13b862b8-8b0a-448a-bbba-7d8ef455d2c6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-795100487-shelved',owner='6d62df5807554f499d26b5fc77ec8603',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:07:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.560 187212 WARNING nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.565 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.566 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.host [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.569 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='28a29f9d03a8dff023fa9db5bc7f166e',container_format='bare',created_at=2025-12-05T12:06:56Z,direct_url=<?>,disk_format='qcow2',id=13b862b8-8b0a-448a-bbba-7d8ef455d2c6,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-795100487-shelved',owner='6d62df5807554f499d26b5fc77ec8603',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:07:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.570 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.571 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.virt.hardware [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.572 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.636 187212 DEBUG nova.virt.libvirt.vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='13b862b8-8b0a-448a-bbba-7d8ef455d2c6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.637 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.637 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.638 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <uuid>5d70ac2d-111f-4e1b-ac26-3e02849b0458</uuid>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <name>instance-0000003e</name>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-795100487</nova:name>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:27</nova:creationTime>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:user uuid="bc4332be3b424a5e996b61b244505cfc">tempest-AttachVolumeShelveTestJSON-1858452545-project-member</nova:user>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:project uuid="6d62df5807554f499d26b5fc77ec8603">tempest-AttachVolumeShelveTestJSON-1858452545</nova:project>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="13b862b8-8b0a-448a-bbba-7d8ef455d2c6"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        <nova:port uuid="ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="serial">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="uuid">5d70ac2d-111f-4e1b-ac26-3e02849b0458</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:6a:c5:99"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <target dev="tapac02dd63-5a"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/console.log" append="off"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <input type="keyboard" bus="usb"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:27 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:27 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:27 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:27 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Preparing to wait for external event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.910 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.911 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.911 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.virt.libvirt.vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='13b862b8-8b0a-448a-bbba-7d8ef455d2c6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member',shelved_at='2025-12-05T12:07:04.112687',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='13b862b8-8b0a-448a-bbba-7d8ef455d2c6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.912 187212 DEBUG nova.network.os_vif_util [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG os_vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.913 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.914 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac02dd63-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.917 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapac02dd63-5a, col_values=(('external_ids', {'iface-id': 'ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:c5:99', 'vm-uuid': '5d70ac2d-111f-4e1b-ac26-3e02849b0458'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 NetworkManager[55691]: <info>  [1764936447.9198] manager: (tapac02dd63-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:27 np0005546909 nova_compute[187208]: 2025-12-05 12:07:27.926 187212 INFO os_vif [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.155 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.164 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.165 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.167 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] No VIF found with MAC fa:16:3e:6a:c5:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.168 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Using config drive#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.211 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.2097366, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.211 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.214 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.214 187212 DEBUG nova.network.neutron [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:28 np0005546909 podman[228349]: 2025-12-05 12:07:28.219543271 +0000 UTC m=+0.066254564 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:07:28 np0005546909 podman[228348]: 2025-12-05 12:07:28.229796208 +0000 UTC m=+0.078686124 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.324 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936433.3228905, bcdca3f9-3e24-4209-808c-8093b55e5c2d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.324 187212 INFO nova.compute.manager [-] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG nova.compute.manager [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.371 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.372 187212 DEBUG oslo_concurrency.lockutils [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.372 187212 DEBUG nova.compute.manager [req-3be979e3-3232-4271-8b6b-53779fb3f3cd req-ac99f4a4-7cf2-44d3-8e65-a24d01eea0e7 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Processing event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.373 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.376 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.379 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance spawned successfully.#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.379 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.442 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.446 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.448 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.449 187212 DEBUG oslo_concurrency.lockutils [req-92236ebe-f863-493e-9888-c4e7e6d36477 req-d235a098-1898-44fc-826a-11156aa72289 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.451 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.503 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.504 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.2099152, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.504 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.510 187212 DEBUG nova.compute.manager [None req-815fa428-c21e-488f-96f0-4b03e6268301 - - - - - -] [instance: bcdca3f9-3e24-4209-808c-8093b55e5c2d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.518 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.518 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.519 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.520 187212 DEBUG nova.virt.libvirt.driver [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.524 187212 DEBUG nova.objects.instance [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'keypairs' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.526 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.529 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936448.381738, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.529 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.549 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.557 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.574 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.581 187212 INFO nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 7.92 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.581 187212 DEBUG nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.634 187212 INFO nova.compute.manager [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 8.41 seconds to build instance.#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.648 187212 DEBUG oslo_concurrency.lockutils [None req-2ed6d5d6-08d9-4f32-bbba-3e7667e330c9 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.845 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.884 187212 INFO nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Creating config drive at /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config#033[00m
Dec  5 07:07:28 np0005546909 nova_compute[187208]: 2025-12-05 12:07:28.890 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hv2bkec execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.018 187212 DEBUG oslo_concurrency.processutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4hv2bkec" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:29 np0005546909 kernel: tapac02dd63-5a: entered promiscuous mode
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.0866] manager: (tapac02dd63-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/230)
Dec  5 07:07:29 np0005546909 systemd-udevd[228327]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:29Z|00557|binding|INFO|Claiming lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for this chassis.
Dec  5 07:07:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:29Z|00558|binding|INFO|ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b: Claiming fa:16:3e:6a:c5:99 10.100.0.8
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.1034] device (tapac02dd63-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.1044] device (tapac02dd63-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:29Z|00559|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b ovn-installed in OVS
Dec  5 07:07:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:29Z|00560|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b up in Southbound
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.108 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.109 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 bound to our chassis#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.113 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc6ce614-d0f7-413f-bc3e-26f7271993d9#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.123 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b24dc6d3-79dd-4ed8-993b-4e469bb568c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.124 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc6ce614-d1 in ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.126 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc6ce614-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.126 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d1900567-7792-46e3-a080-2336e1eaba9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.127 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[46a14a7b-f6e6-40e1-962d-02a62b3117d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 systemd-machined[153543]: New machine qemu-72-instance-0000003e.
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.142 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fa45b0-c070-4c49-8273-586640ae8f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 systemd[1]: Started Virtual Machine qemu-72-instance-0000003e.
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[43daa7d4-ecd8-4139-9f89-19148a6240f4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.200 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1bd4aa-fbdc-42f7-b620-03c9082fdd58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.213 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6098934c-c22e-4f1d-b88c-e671619d479b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.2178] manager: (tapfc6ce614-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/231)
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.249 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa64d84-8463-400f-8ca0-c2f357a0ebd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.252 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[259513ff-8b04-4a34-b61a-984efa0f7cf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.2838] device (tapfc6ce614-d0): carrier: link connected
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.291 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[852bfbe5-67b3-412a-afe8-db8cfbab04be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.310 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00fad784-ba8f-4520-af9e-74d3b90a6242]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382984, 'reachable_time': 22274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228437, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.324 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f6217ad-57ce-40d5-8cc8-6f0961d1b6b8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:6b90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382984, 'tstamp': 382984}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228438, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.351 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cabaf7c-e0d5-4151-97f0-d0eec0ee0274]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc6ce614-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:6b:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382984, 'reachable_time': 22274, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228439, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.383 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[18f3c71b-1ed8-4ad2-a57c-51568d7d8c96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.464 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97ae0480-6004-4017-85f8-f3e3ed288acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.465 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc6ce614-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 NetworkManager[55691]: <info>  [1764936449.4677] manager: (tapfc6ce614-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Dec  5 07:07:29 np0005546909 kernel: tapfc6ce614-d0: entered promiscuous mode
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.472 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc6ce614-d0, col_values=(('external_ids', {'iface-id': '1b193bb7-c39e-445c-9a2c-dd8ee58553b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:29Z|00561|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.475 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.475 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.476 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[432b1155-12fe-4d0d-b8c7-a2d1bc43e8ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.477 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/fc6ce614-d0f7-413f-bc3e-26f7271993d9.pid.haproxy
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID fc6ce614-d0f7-413f-bc3e-26f7271993d9
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:07:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:29.478 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'env', 'PROCESS_TAG=haproxy-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc6ce614-d0f7-413f-bc3e-26f7271993d9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:07:29 np0005546909 nova_compute[187208]: 2025-12-05 12:07:29.487 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:29 np0005546909 podman[228472]: 2025-12-05 12:07:29.885780582 +0000 UTC m=+0.059190478 container create 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:07:29 np0005546909 systemd[1]: Started libpod-conmon-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope.
Dec  5 07:07:29 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:07:29 np0005546909 podman[228472]: 2025-12-05 12:07:29.853502336 +0000 UTC m=+0.026912262 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:07:29 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/609c1c748c26ab3742ccbcbaed3a0fb9e3b7ac74e56bc02b438dfce85dc57371/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:07:29 np0005546909 podman[228472]: 2025-12-05 12:07:29.995367832 +0000 UTC m=+0.168777728 container init 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  5 07:07:30 np0005546909 podman[228472]: 2025-12-05 12:07:30.004891589 +0000 UTC m=+0.178301485 container start 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.010 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936435.0093517, 8888dd78-1c78-4065-8536-9a1096bdf57b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.011 187212 INFO nova.compute.manager [-] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:30 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : New worker (228493) forked
Dec  5 07:07:30 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : Loading success.
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.526 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936450.5262105, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.730 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.731 187212 DEBUG nova.compute.manager [None req-853b82b1-d298-4fbc-81ac-8e0255a73d3b - - - - - -] [instance: 8888dd78-1c78-4065-8536-9a1096bdf57b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.734 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936450.5295753, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:30 np0005546909 nova_compute[187208]: 2025-12-05 12:07:30.735 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:07:31 np0005546909 nova_compute[187208]: 2025-12-05 12:07:31.246 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:31 np0005546909 nova_compute[187208]: 2025-12-05 12:07:31.251 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:31Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:e5:0a 10.100.0.13
Dec  5 07:07:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:31Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:e5:0a 10.100.0.13
Dec  5 07:07:31 np0005546909 nova_compute[187208]: 2025-12-05 12:07:31.813 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:32Z|00562|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:07:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:32Z|00563|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec  5 07:07:32 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:32Z|00564|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.873 187212 DEBUG nova.network.neutron [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.912 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936437.9086945, cbcd4733-8c53-4696-9bc0-6e5c516c9dcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.912 187212 INFO nova.compute.manager [-] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.915 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.916 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance network_info: |[{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.918 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start _get_guest_xml network_info=[{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.920 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.925 187212 WARNING nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.931 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.933 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.938 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.939 187212 DEBUG nova.virt.libvirt.host [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.939 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.940 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.940 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.941 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.941 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.942 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.942 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.943 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.944 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.944 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.945 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.946 187212 DEBUG nova.virt.hardware [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.953 187212 DEBUG nova.virt.libvirt.vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.954 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.955 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.957 187212 DEBUG nova.objects.instance [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.960 187212 DEBUG nova.compute.manager [None req-ebb36257-f0c5-4f43-a28e-1f4d3fc15521 - - - - - -] [instance: cbcd4733-8c53-4696-9bc0-6e5c516c9dcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.972 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <name>instance-00000043</name>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:32</nova:creationTime>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="serial">f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="uuid">f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:01:99:b0"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <target dev="tapf7a6775e-6d"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log" append="off"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:32 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:32 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:32 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:32 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Preparing to wait for external event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.978 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.979 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.979 187212 DEBUG nova.virt.libvirt.vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.980 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.981 187212 DEBUG nova.network.os_vif_util [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.981 187212 DEBUG os_vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.982 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.983 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.986 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.986 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a6775e-6d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.987 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7a6775e-6d, col_values=(('external_ids', {'iface-id': 'f7a6775e-6d9c-48e1-91d7-829a6f5f3742', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:99:b0', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.988 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 NetworkManager[55691]: <info>  [1764936452.9895] manager: (tapf7a6775e-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.996 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:32 np0005546909 nova_compute[187208]: 2025-12-05 12:07:32.997 187212 INFO os_vif [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d')#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.050 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.051 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Using config drive#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.303 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.304 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.304 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.305 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.305 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.801 187212 INFO nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Creating config drive at /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.808 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3wf633v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:33 np0005546909 nova_compute[187208]: 2025-12-05 12:07:33.939 187212 DEBUG oslo_concurrency.processutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph3wf633v" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.0304] manager: (tapf7a6775e-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Dec  5 07:07:34 np0005546909 kernel: tapf7a6775e-6d: entered promiscuous mode
Dec  5 07:07:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:34Z|00565|binding|INFO|Claiming lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for this chassis.
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.079 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:34Z|00566|binding|INFO|f7a6775e-6d9c-48e1-91d7-829a6f5f3742: Claiming fa:16:3e:01:99:b0 10.100.0.7
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.087 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:99:b0 10.100.0.7'], port_security=['fa:16:3e:01:99:b0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '83c79c65-073e-4860-a990-92e9abafc0bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a6775e-6d9c-48e1-91d7-829a6f5f3742) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.088 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.090 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:07:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:34Z|00567|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 ovn-installed in OVS
Dec  5 07:07:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:34Z|00568|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 up in Southbound
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.100 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.102 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[168d68e2-18b1-4e6c-b97f-38bfcbcefa89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.103 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.106 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.106 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7e77ab2c-f126-487d-befb-65ab38c31b3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb789ad-fb57-4046-8767-a8540f71e515]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 systemd-machined[153543]: New machine qemu-73-instance-00000043.
Dec  5 07:07:34 np0005546909 systemd-udevd[228593]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:34 np0005546909 systemd[1]: Started Virtual Machine qemu-73-instance-00000043.
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.120 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[da03d46c-8f1c-4082-9bed-fa0a5669206c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.1369] device (tapf7a6775e-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.1382] device (tapf7a6775e-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:34 np0005546909 podman[228546]: 2025-12-05 12:07:34.143658297 +0000 UTC m=+0.127948913 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c1b84b-9ea3-43ce-80ee-94188ff59cda]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 podman[228548]: 2025-12-05 12:07:34.155598944 +0000 UTC m=+0.140872749 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.184 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8768fd-cbaf-407e-a55b-3e8a868fbc6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.1971] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.196 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc089378-684b-4bbe-a7df-25c94f67d3ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.229 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4f756c58-347a-423b-bf33-fe26a0676ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.233 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b153919e-83f0-4297-a643-6a4f8a5e7e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.2586] device (tapfbfed6fc-30): carrier: link connected
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.265 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[59e43858-6186-4b26-945a-f96fce4155c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.283 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9bf408-f233-475f-95bd-c1a8ad227ef5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228639, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.297 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[945c38f2-2c3e-4beb-898c-4e963aaf672f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383482, 'tstamp': 383482}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228640, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[df3024f8-0270-4e53-93d5-5eb1e401f076]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228641, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6f01e0-c185-4132-b757-0f7e67f95bc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.418 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5559a484-0036-460d-b826-1bef9768ac7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.420 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.421 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:34 np0005546909 NetworkManager[55691]: <info>  [1764936454.4256] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Dec  5 07:07:34 np0005546909 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.427 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:34Z|00569|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.430 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.434 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0eb845-ead2-4d37-904c-d2aa82e112db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.434 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:07:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:34.435 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.443 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.445 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936454.4443576, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.445 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.470 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.475 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936454.4447885, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.476 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.503 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.506 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:34 np0005546909 nova_compute[187208]: 2025-12-05 12:07:34.536 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:34 np0005546909 podman[228681]: 2025-12-05 12:07:34.865663879 +0000 UTC m=+0.053371170 container create 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:07:34 np0005546909 systemd[1]: Started libpod-conmon-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope.
Dec  5 07:07:34 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:07:34 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/186f8522ea6206e72ff71431d61cc132ff2e048571a25807429a54cf15a146be/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:07:34 np0005546909 podman[228681]: 2025-12-05 12:07:34.834604908 +0000 UTC m=+0.022312229 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:07:34 np0005546909 podman[228681]: 2025-12-05 12:07:34.941378226 +0000 UTC m=+0.129085557 container init 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  5 07:07:34 np0005546909 podman[228681]: 2025-12-05 12:07:34.946771683 +0000 UTC m=+0.134478984 container start 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:07:34 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : New worker (228703) forked
Dec  5 07:07:34 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : Loading success.
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.125 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.125 187212 DEBUG nova.network.neutron [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.173 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.174 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.174 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] No waiting events found dispatching network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.175 187212 WARNING nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received unexpected event network-vif-plugged-d596fdf6-011f-43a4-bdb8-e76cc7302187 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.176 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Processing event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.177 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 DEBUG oslo_concurrency.lockutils [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 DEBUG nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.178 187212 WARNING nova.compute.manager [req-69e5cd2a-2b1f-497a-bd06-d12144431b92 req-9e140847-e09a-4036-82aa-0a3de369dd5c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.179 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.184 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936455.1841257, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.184 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.186 187212 DEBUG nova.virt.libvirt.driver [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.195 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance spawned successfully.#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.206 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.211 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:35 np0005546909 nova_compute[187208]: 2025-12-05 12:07:35.256 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.078 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.078 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.100 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.234 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.235 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.244 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.244 187212 INFO nova.compute.claims [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.589 187212 DEBUG nova.compute.provider_tree [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.610 187212 DEBUG nova.scheduler.client.report [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.651 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.652 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.681 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.683 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Processing event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.684 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.685 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.685 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.686 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.690 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.692 187212 INFO nova.virt.libvirt.driver [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance spawned successfully.#033[00m
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.693 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.696 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936456.6958237, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.696 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Resumed (Lifecycle Event)
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.707 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.707 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.720 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.720 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.721 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.721 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.722 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.722 187212 DEBUG nova.virt.libvirt.driver [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.728 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.731 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.735 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.765 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.766 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.820 187212 INFO nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 13.20 seconds to spawn the instance on the hypervisor.
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.821 187212 DEBUG nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.897 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.899 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.899 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating image(s)
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.900 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.900 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.901 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.923 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.961 187212 INFO nova.compute.manager [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 13.87 seconds to build instance.
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.990 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.991 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.991 187212 INFO nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Rebooting instance
Dec  5 07:07:36 np0005546909 nova_compute[187208]: 2025-12-05 12:07:36.993 187212 DEBUG oslo_concurrency.lockutils [None req-e1bbb220-aefe-4c68-bc3d-c284874b62af 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.980s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.012 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.013 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.014 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.015 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.032 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.099 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.100 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.144 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.145 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.146 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.213 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.216 187212 DEBUG nova.virt.disk.api [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Checking if we can resize image /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.216 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.251 187212 DEBUG nova.policy [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.287 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.293 187212 DEBUG nova.virt.disk.api [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Cannot resize image /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.294 187212 DEBUG nova.objects.instance [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.310 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.311 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Ensure instance console log exists: /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.311 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.312 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.312 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.402 187212 DEBUG nova.compute.manager [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.616 187212 DEBUG oslo_concurrency.lockutils [None req-bf976578-0800-428e-bd52-ffbe5d7dcaec bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 19.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.837 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:37 np0005546909 nova_compute[187208]: 2025-12-05 12:07:37.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:38 np0005546909 podman[228728]: 2025-12-05 12:07:38.212691344 +0000 UTC m=+0.059257941 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:07:38 np0005546909 nova_compute[187208]: 2025-12-05 12:07:38.768 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Successfully created port: 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.028 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.029 187212 DEBUG nova.network.neutron [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.062 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.063 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 DEBUG oslo_concurrency.lockutils [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 DEBUG nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.064 187212 WARNING nova.compute.manager [req-c5425208-80c6-4f7d-8923-64d91ec22d2d req-3ddff35e-8839-421c-a05f-23e26edac5dd 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state building and task_state spawning.
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.065 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:07:39 np0005546909 nova_compute[187208]: 2025-12-05 12:07:39.065 187212 DEBUG nova.network.neutron [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.440 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:07:40 np0005546909 nova_compute[187208]: 2025-12-05 12:07:40.551 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec  5 07:07:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:41Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:20:58:3d 10.100.0.11
Dec  5 07:07:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:41Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:20:58:3d 10.100.0.11
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.104 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Successfully updated port: 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.126 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.225 187212 DEBUG nova.compute.manager [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-changed-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.226 187212 DEBUG nova.compute.manager [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Refreshing instance network info cache due to event network-changed-29e412e9-d3cc-4af2-b85a-ab48fcad0372. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.226 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.724 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.821 187212 DEBUG nova.network.neutron [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.840 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:41 np0005546909 nova_compute[187208]: 2025-12-05 12:07:41.841 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:41 np0005546909 kernel: tapd596fdf6-01 (unregistering): left promiscuous mode
Dec  5 07:07:41 np0005546909 NetworkManager[55691]: <info>  [1764936461.9954] device (tapd596fdf6-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00570|binding|INFO|Releasing lport d596fdf6-011f-43a4-bdb8-e76cc7302187 from this chassis (sb_readonly=0)
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00571|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 down in Southbound
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00572|binding|INFO|Removing iface tapd596fdf6-01 ovn-installed in OVS
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.019 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.020 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.022 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.041 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d5bd40-0bad-47ca-84a5-53fc9b883c4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000042.scope: Deactivated successfully.
Dec  5 07:07:42 np0005546909 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000042.scope: Consumed 13.008s CPU time.
Dec  5 07:07:42 np0005546909 systemd-machined[153543]: Machine qemu-71-instance-00000042 terminated.
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.078 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[28bc87fb-b749-4259-8d5a-b44ffaea15da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.081 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[65a89004-633d-4cb5-9d1b-abc0ac6a831e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.110 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4f997d39-6283-45e5-8043-556e70b52205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.124 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[592d3d6f-3e64-4abc-8912-e3abccb5f1ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228764, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.138 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b520aede-01bf-4b69-be96-3cc823695a75]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228765, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228765, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.140 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.149 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.150 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.186 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.194 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.227 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance destroyed successfully.#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.228 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.244 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.245 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.246 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.246 187212 DEBUG os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.249 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.249 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd596fdf6-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.253 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.256 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.258 187212 INFO os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.265 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start _get_guest_xml network_info=[{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.269 187212 WARNING nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.277 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.279 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.284 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.libvirt.host [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.285 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.286 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.virt.hardware [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.287 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.304 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.362 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.363 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.363 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.364 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.366 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.366 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.367 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.368 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.387 187212 DEBUG nova.virt.libvirt.driver [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <uuid>39a36503-acd4-4199-89f3-2e714ef9e5c5</uuid>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <name>instance-00000042</name>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:name>tempest-SecurityGroupsTestJSON-server-1919324581</nova:name>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:42</nova:creationTime>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:user uuid="8db061f8c48141d1ac1c3216db1cc7f8">tempest-SecurityGroupsTestJSON-549628149-project-member</nova:user>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:project uuid="442a804e3368417d9de1636d533a25e0">tempest-SecurityGroupsTestJSON-549628149</nova:project>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        <nova:port uuid="d596fdf6-011f-43a4-bdb8-e76cc7302187">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="serial">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="uuid">39a36503-acd4-4199-89f3-2e714ef9e5c5</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk.config"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:20:58:3d"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <target dev="tapd596fdf6-01"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/console.log" append="off"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <input type="keyboard" bus="usb"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:42 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:42 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:42 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:42 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.388 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.457 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.458 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.518 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.520 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.534 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.597 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.598 187212 DEBUG nova.virt.disk.api [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Checking if we can resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.599 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.663 187212 DEBUG oslo_concurrency.processutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.664 187212 DEBUG nova.virt.disk.api [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Cannot resize image /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.664 187212 DEBUG nova.objects.instance [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'migration_context' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.682 187212 DEBUG nova.virt.libvirt.vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:41Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.683 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.684 187212 DEBUG nova.network.os_vif_util [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.685 187212 DEBUG os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.685 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.686 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.691 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd596fdf6-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.692 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd596fdf6-01, col_values=(('external_ids', {'iface-id': 'd596fdf6-011f-43a4-bdb8-e76cc7302187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:58:3d', 'vm-uuid': '39a36503-acd4-4199-89f3-2e714ef9e5c5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 NetworkManager[55691]: <info>  [1764936462.7419] manager: (tapd596fdf6-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.742 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.747 187212 INFO os_vif [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')#033[00m
Dec  5 07:07:42 np0005546909 kernel: tapd596fdf6-01: entered promiscuous mode
Dec  5 07:07:42 np0005546909 systemd-udevd[228755]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:42 np0005546909 NetworkManager[55691]: <info>  [1764936462.8246] manager: (tapd596fdf6-01): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00573|binding|INFO|Claiming lport d596fdf6-011f-43a4-bdb8-e76cc7302187 for this chassis.
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00574|binding|INFO|d596fdf6-011f-43a4-bdb8-e76cc7302187: Claiming fa:16:3e:20:58:3d 10.100.0.11
Dec  5 07:07:42 np0005546909 NetworkManager[55691]: <info>  [1764936462.8393] device (tapd596fdf6-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:42 np0005546909 NetworkManager[55691]: <info>  [1764936462.8434] device (tapd596fdf6-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.841 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.844 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 bound to our chassis#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.848 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4#033[00m
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00575|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 ovn-installed in OVS
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00576|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 up in Southbound
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00577|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00578|binding|INFO|Releasing lport 1b193bb7-c39e-445c-9a2c-dd8ee58553b9 from this chassis (sb_readonly=0)
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00579|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:07:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:42Z|00580|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.864 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1d409f7f-1748-41e1-8e23-c24570f7fd79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 systemd-machined[153543]: New machine qemu-74-instance-00000042.
Dec  5 07:07:42 np0005546909 systemd[1]: Started Virtual Machine qemu-74-instance-00000042.
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.899 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bfe351-eb8b-4dda-888c-6fb10efc959b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.902 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e3de5e47-d2cb-4930-a3e4-a8261bc24802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.931 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d62d6c-f5a3-478f-bed8-2d777286229d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.951 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4707a2-4008-4dab-8a20-d39f19c8967f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 9, 'rx_bytes': 616, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228824, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.977 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52a8ba84-e2b1-429f-95f5-bbe2a57ccb19]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228827, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228827, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.979 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 nova_compute[187208]: 2025-12-05 12:07:42.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.982 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.983 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:42.984 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.043 187212 DEBUG nova.network.neutron [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.070 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.070 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance network_info: |[{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.071 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.071 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Refreshing network info cache for port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.075 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Start _get_guest_xml network_info=[{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.081 187212 WARNING nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.093 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.094 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.105 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.106 187212 DEBUG nova.virt.libvirt.host [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.106 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.107 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.108 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.109 187212 DEBUG nova.virt.hardware [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.114 187212 DEBUG nova.virt.libvirt.vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:36Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.114 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.115 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.116 187212 DEBUG nova.objects.instance [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.162 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <uuid>5659bd52-8c24-483d-80a4-8eb6b28e1349</uuid>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <name>instance-00000044</name>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestOtherB-server-1539570170</nova:name>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:07:43</nova:creationTime>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        <nova:port uuid="29e412e9-d3cc-4af2-b85a-ab48fcad0372">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="serial">5659bd52-8c24-483d-80a4-8eb6b28e1349</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="uuid">5659bd52-8c24-483d-80a4-8eb6b28e1349</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:68:32:38"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <target dev="tap29e412e9-d3"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/console.log" append="off"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:07:43 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:07:43 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:07:43 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:07:43 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Preparing to wait for external event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.163 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.164 187212 DEBUG nova.virt.libvirt.vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:36Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.164 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG nova.network.os_vif_util [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG os_vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.166 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.166 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.169 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29e412e9-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.170 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29e412e9-d3, col_values=(('external_ids', {'iface-id': '29e412e9-d3cc-4af2-b85a-ab48fcad0372', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:32:38', 'vm-uuid': '5659bd52-8c24-483d-80a4-8eb6b28e1349'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:43 np0005546909 NetworkManager[55691]: <info>  [1764936463.1721] manager: (tap29e412e9-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/239)
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.178 187212 INFO os_vif [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3')#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.290 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 39a36503-acd4-4199-89f3-2e714ef9e5c5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.290 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936463.2896805, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.291 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.293 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.295 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance rebooted successfully.#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.296 187212 DEBUG nova.compute.manager [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.520 187212 DEBUG nova.compute.manager [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.520 187212 DEBUG nova.compute.manager [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-f7a6775e-6d9c-48e1-91d7-829a6f5f3742. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.521 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.574 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.578 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.602 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:68:32:38, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.603 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Using config drive#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.609 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.610 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936463.2913365, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.610 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.623 187212 DEBUG oslo_concurrency.lockutils [None req-e2f0d5ac-091a-4d98-82ae-1c9a27c352cc 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.634 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:43 np0005546909 nova_compute[187208]: 2025-12-05 12:07:43.637 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:44 np0005546909 podman[228840]: 2025-12-05 12:07:44.210060978 +0000 UTC m=+0.060761214 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.687 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [{"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.783 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-5d70ac2d-111f-4e1b-ac26-3e02849b0458" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.784 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.784 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.785 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.786 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.880 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.881 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:07:44 np0005546909 nova_compute[187208]: 2025-12-05 12:07:44.985 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.122 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.124 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.175 187212 INFO nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Creating config drive at /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.181 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkht8fwj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.202 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.209 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.277 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.278 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.313 187212 DEBUG oslo_concurrency.processutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprkht8fwj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.350 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.359 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 NetworkManager[55691]: <info>  [1764936465.3792] manager: (tap29e412e9-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Dec  5 07:07:45 np0005546909 kernel: tap29e412e9-d3: entered promiscuous mode
Dec  5 07:07:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:45Z|00581|binding|INFO|Claiming lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 for this chassis.
Dec  5 07:07:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:45Z|00582|binding|INFO|29e412e9-d3cc-4af2-b85a-ab48fcad0372: Claiming fa:16:3e:68:32:38 10.100.0.6
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.385 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:45Z|00583|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 ovn-installed in OVS
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:45 np0005546909 systemd-udevd[228896]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:07:45 np0005546909 systemd-machined[153543]: New machine qemu-75-instance-00000044.
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.437 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.438 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 systemd[1]: Started Virtual Machine qemu-75-instance-00000044.
Dec  5 07:07:45 np0005546909 NetworkManager[55691]: <info>  [1764936465.4475] device (tap29e412e9-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:07:45 np0005546909 NetworkManager[55691]: <info>  [1764936465.4484] device (tap29e412e9-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.512 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.533 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.588 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:32:38 10.100.0.6'], port_security=['fa:16:3e:68:32:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5659bd52-8c24-483d-80a4-8eb6b28e1349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=29e412e9-d3cc-4af2-b85a-ab48fcad0372) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.590 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis#033[00m
Dec  5 07:07:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:45Z|00584|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 up in Southbound
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.592 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.608 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7038ee-9865-423e-ba3e-4d2d354e0e92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.608 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.609 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.635 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d116ef78-0b7b-4f66-8451-9a5896ff7ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.639 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6f73cc-1a61-4e90-a0db-9ec209715629]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.666 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.667 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1e022c-44f7-469b-8275-4b8ceacae915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.682 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.685 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c20b6f1-0456-4d8d-b26b-3b6a2fa60ab8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228921, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.704 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4f63db98-e7a4-40df-9dbb-0058e4b9d3e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228922, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228922, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.706 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.716 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:45.717 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.717 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.779 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.780 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.846 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.856 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.937 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:45 np0005546909 nova_compute[187208]: 2025-12-05 12:07:45.938 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.009 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.016 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.064 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936466.0631733, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.065 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Started (Lifecycle Event)#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.090 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.090 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.154 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.162 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936466.0634668, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.162 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.170 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.378 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.383 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.443 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.444 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4805MB free_disk=73.0350456237793GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.444 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.445 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.450 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.484 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updated VIF entry in instance network info cache for port 29e412e9-d3cc-4af2-b85a-ab48fcad0372. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.484 187212 DEBUG nova.network.neutron [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:46.622 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:46 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:46.622 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.627 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.653 187212 DEBUG oslo_concurrency.lockutils [req-eeaf68d5-4a95-4416-9564-025e5206ba78 req-8e8c951a-d6b1-480f-ad36-9a4790bf7194 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.684 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 39a36503-acd4-4199-89f3-2e714ef9e5c5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 5659bd52-8c24-483d-80a4-8eb6b28e1349 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 7 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.685 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1408MB phys_disk=79GB used_disk=7GB total_vcpus=8 used_vcpus=7 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.863 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.880 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.889 187212 DEBUG nova.compute.manager [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG nova.compute.manager [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing instance network info cache due to event network-changed-d596fdf6-011f-43a4-bdb8-e76cc7302187. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.890 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Refreshing network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.922 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:07:46 np0005546909 nova_compute[187208]: 2025-12-05 12:07:46.922 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:47 np0005546909 nova_compute[187208]: 2025-12-05 12:07:47.056 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port f7a6775e-6d9c-48e1-91d7-829a6f5f3742. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:47 np0005546909 nova_compute[187208]: 2025-12-05 12:07:47.056 187212 DEBUG nova.network.neutron [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:47Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:c5:99 10.100.0.8
Dec  5 07:07:47 np0005546909 nova_compute[187208]: 2025-12-05 12:07:47.211 187212 DEBUG oslo_concurrency.lockutils [req-a44e40c9-bf30-4d47-83d9-4830e9198768 req-d4c806f9-41bc-4c0c-a6fa-739720ebabb6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:47 np0005546909 nova_compute[187208]: 2025-12-05 12:07:47.423 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:47 np0005546909 nova_compute[187208]: 2025-12-05 12:07:47.855 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:48 np0005546909 nova_compute[187208]: 2025-12-05 12:07:48.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:48Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:99:b0 10.100.0.7
Dec  5 07:07:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:48Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:99:b0 10.100.0.7
Dec  5 07:07:48 np0005546909 nova_compute[187208]: 2025-12-05 12:07:48.918 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:48 np0005546909 nova_compute[187208]: 2025-12-05 12:07:48.919 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.040 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.041 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.042 187212 INFO nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Terminating instance#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.043 187212 DEBUG nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:50 np0005546909 kernel: tapd596fdf6-01 (unregistering): left promiscuous mode
Dec  5 07:07:50 np0005546909 NetworkManager[55691]: <info>  [1764936470.0608] device (tapd596fdf6-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:50Z|00585|binding|INFO|Releasing lport d596fdf6-011f-43a4-bdb8-e76cc7302187 from this chassis (sb_readonly=0)
Dec  5 07:07:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:50Z|00586|binding|INFO|Setting lport d596fdf6-011f-43a4-bdb8-e76cc7302187 down in Southbound
Dec  5 07:07:50 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:50Z|00587|binding|INFO|Removing iface tapd596fdf6-01 ovn-installed in OVS
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.080 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.091 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.098 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:58:3d 10.100.0.11'], port_security=['fa:16:3e:20:58:3d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '39a36503-acd4-4199-89f3-2e714ef9e5c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'af894ac8-cd98-4c47-9a74-1921c6ddcff3 ca5c04f9-329e-4501-a23a-78c17c54e4de fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d596fdf6-011f-43a4-bdb8-e76cc7302187) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.100 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d596fdf6-011f-43a4-bdb8-e76cc7302187 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.102 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd355bd0-560e-4b18-a504-3a5134c930f4#033[00m
Dec  5 07:07:50 np0005546909 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000042.scope: Deactivated successfully.
Dec  5 07:07:50 np0005546909 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000042.scope: Consumed 7.339s CPU time.
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.118 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b7128b-5959-462a-beb2-32990b465c14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 systemd-machined[153543]: Machine qemu-74-instance-00000042 terminated.
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.144 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8e393fd9-10bb-4b1b-bcca-654faef0aaf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.147 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ac807f-163e-4f19-91ab-26225337a4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.175 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[72d6b17e-5a6b-443a-8b71-dedaf3db3c01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.192 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fda1c4d3-36ea-4dd4-8caa-d5089449f25f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd355bd0-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:03:ad'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 11, 'rx_bytes': 616, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376697, 'reachable_time': 32536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228988, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.208 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd80af5-6812-46ae-9ea9-6ba94e8e5cce]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376710, 'tstamp': 376710}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228989, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdd355bd0-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376713, 'tstamp': 376713}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228989, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.210 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.211 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd355bd0-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.216 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd355bd0-50, col_values=(('external_ids', {'iface-id': 'd5a54702-8e08-4aa4-aef4-19a0cc66763a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:50.217 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.264 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.270 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.307 187212 INFO nova.virt.libvirt.driver [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Instance destroyed successfully.#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.309 187212 DEBUG nova.objects.instance [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid 39a36503-acd4-4199-89f3-2e714ef9e5c5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.328 187212 DEBUG nova.virt.libvirt.vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1919324581',display_name='tempest-SecurityGroupsTestJSON-server-1919324581',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1919324581',id=66,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-yf09e02y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:43Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=39a36503-acd4-4199-89f3-2e714ef9e5c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.328 187212 DEBUG nova.network.os_vif_util [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.329 187212 DEBUG nova.network.os_vif_util [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.330 187212 DEBUG os_vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.333 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.333 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd596fdf6-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.341 187212 INFO os_vif [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:58:3d,bridge_name='br-int',has_traffic_filtering=True,id=d596fdf6-011f-43a4-bdb8-e76cc7302187,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd596fdf6-01')#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.342 187212 INFO nova.virt.libvirt.driver [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deleting instance files /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5_del#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.342 187212 INFO nova.virt.libvirt.driver [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deletion of /var/lib/nova/instances/39a36503-acd4-4199-89f3-2e714ef9e5c5_del complete#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.402 187212 INFO nova.compute.manager [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG oslo.service.loopingcall [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.403 187212 DEBUG nova.network.neutron [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.450 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updated VIF entry in instance network info cache for port d596fdf6-011f-43a4-bdb8-e76cc7302187. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.450 187212 DEBUG nova.network.neutron [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [{"id": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "address": "fa:16:3e:20:58:3d", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd596fdf6-01", "ovs_interfaceid": "d596fdf6-011f-43a4-bdb8-e76cc7302187", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:50 np0005546909 nova_compute[187208]: 2025-12-05 12:07:50.480 187212 DEBUG oslo_concurrency.lockutils [req-82d9291f-5382-4915-97e5-e9de11b878d7 req-47ebaf2c-2de3-48f4-9e90-835f3fb54127 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-39a36503-acd4-4199-89f3-2e714ef9e5c5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:07:52 np0005546909 nova_compute[187208]: 2025-12-05 12:07:52.031 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:52 np0005546909 podman[229007]: 2025-12-05 12:07:52.224598455 +0000 UTC m=+0.063975907 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec  5 07:07:52 np0005546909 nova_compute[187208]: 2025-12-05 12:07:52.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:52 np0005546909 nova_compute[187208]: 2025-12-05 12:07:52.905 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:52 np0005546909 nova_compute[187208]: 2025-12-05 12:07:52.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.201 187212 DEBUG nova.network.neutron [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.218 187212 INFO nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Took 2.81 seconds to deallocate network for instance.#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.281 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.281 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.406 187212 DEBUG nova.compute.manager [req-5cc4fc36-9d4a-4daa-aeb4-94654b1b00a1 req-57451ebf-db91-4dd7-9480-89a0975d2215 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Received event network-vif-deleted-d596fdf6-011f-43a4-bdb8-e76cc7302187 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.528 187212 DEBUG nova.compute.provider_tree [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.554 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.554 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.556 187212 DEBUG nova.scheduler.client.report [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.605 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.608 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.647 187212 INFO nova.scheduler.client.report [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Deleted allocations for instance 39a36503-acd4-4199-89f3-2e714ef9e5c5#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.719 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.720 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.725 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.726 187212 INFO nova.compute.claims [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.740 187212 DEBUG oslo_concurrency.lockutils [None req-fd6ea7ab-c64e-455e-b281-3a4131d9a8f8 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "39a36503-acd4-4199-89f3-2e714ef9e5c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:53 np0005546909 nova_compute[187208]: 2025-12-05 12:07:53.994 187212 DEBUG nova.compute.provider_tree [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.023 187212 DEBUG nova.scheduler.client.report [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.051 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.051 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.109 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.110 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.134 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.171 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.300 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.303 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.304 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating image(s)#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.305 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.305 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.306 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.319 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.382 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.383 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.383 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.395 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.457 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.459 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.499 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.501 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.501 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.568 187212 DEBUG nova.policy [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '21ddc7a76417447daa2a5a26cdf17d53', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.572 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.573 187212 DEBUG nova.virt.disk.api [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Checking if we can resize image /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.573 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.635 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.636 187212 DEBUG nova.virt.disk.api [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Cannot resize image /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.637 187212 DEBUG nova.objects.instance [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.664 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.665 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Ensure instance console log exists: /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.665 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.666 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:54 np0005546909 nova_compute[187208]: 2025-12-05 12:07:54.666 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:55 np0005546909 nova_compute[187208]: 2025-12-05 12:07:55.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:55.624 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.711 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.712 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.712 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.713 187212 INFO nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Terminating instance#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.715 187212 DEBUG nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:07:56 np0005546909 kernel: tapac02dd63-5a (unregistering): left promiscuous mode
Dec  5 07:07:56 np0005546909 NetworkManager[55691]: <info>  [1764936476.7444] device (tapac02dd63-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:56Z|00588|binding|INFO|Releasing lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b from this chassis (sb_readonly=0)
Dec  5 07:07:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:56Z|00589|binding|INFO|Setting lport ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b down in Southbound
Dec  5 07:07:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:07:56Z|00590|binding|INFO|Removing iface tapac02dd63-5a ovn-installed in OVS
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.760 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:99 10.100.0.8'], port_security=['fa:16:3e:6a:c5:99 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5d70ac2d-111f-4e1b-ac26-3e02849b0458', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d62df5807554f499d26b5fc77ec8603', 'neutron:revision_number': '9', 'neutron:security_group_ids': '5a04f4af-e81b-4661-95ed-5737ffc98cae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a7d298f-265e-44c5-a73a-18dd9ed0b171, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:07:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.762 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b in datapath fc6ce614-d0f7-413f-bc3e-26f7271993d9 unbound from our chassis#033[00m
Dec  5 07:07:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.764 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc6ce614-d0f7-413f-bc3e-26f7271993d9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:07:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.765 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d6427e64-d7a6-4166-b715-21e4c5283020]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:56.765 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 namespace which is not needed anymore#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.782 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:56 np0005546909 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Dec  5 07:07:56 np0005546909 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000003e.scope: Consumed 13.871s CPU time.
Dec  5 07:07:56 np0005546909 systemd-machined[153543]: Machine qemu-72-instance-0000003e terminated.
Dec  5 07:07:56 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : haproxy version is 2.8.14-c23fe91
Dec  5 07:07:56 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [NOTICE]   (228491) : path to executable is /usr/sbin/haproxy
Dec  5 07:07:56 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [WARNING]  (228491) : Exiting Master process...
Dec  5 07:07:56 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [ALERT]    (228491) : Current worker (228493) exited with code 143 (Terminated)
Dec  5 07:07:56 np0005546909 neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9[228487]: [WARNING]  (228491) : All workers exited. Exiting... (0)
Dec  5 07:07:56 np0005546909 systemd[1]: libpod-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope: Deactivated successfully.
Dec  5 07:07:56 np0005546909 podman[229063]: 2025-12-05 12:07:56.903234872 +0000 UTC m=+0.045119500 container died 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:07:56 np0005546909 systemd[1]: var-lib-containers-storage-overlay-609c1c748c26ab3742ccbcbaed3a0fb9e3b7ac74e56bc02b438dfce85dc57371-merged.mount: Deactivated successfully.
Dec  5 07:07:56 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f-userdata-shm.mount: Deactivated successfully.
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.935 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.940 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:56 np0005546909 podman[229063]: 2025-12-05 12:07:56.942285045 +0000 UTC m=+0.084169673 container cleanup 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:07:56 np0005546909 systemd[1]: libpod-conmon-770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f.scope: Deactivated successfully.
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.982 187212 INFO nova.virt.libvirt.driver [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Instance destroyed successfully.#033[00m
Dec  5 07:07:56 np0005546909 nova_compute[187208]: 2025-12-05 12:07:56.982 187212 DEBUG nova.objects.instance [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lazy-loading 'resources' on Instance uuid 5d70ac2d-111f-4e1b-ac26-3e02849b0458 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:07:57 np0005546909 podman[229104]: 2025-12-05 12:07:57.008701333 +0000 UTC m=+0.044357308 container remove 770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.013 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[787d7c9a-baf1-4ea8-bb7c-466918771aa3]: (4, ('Fri Dec  5 12:07:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f)\n770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f\nFri Dec  5 12:07:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 (770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f)\n770b472cd482b419d1f4d9f4269fe4f1a540317daea03b517bbce83a8ff7ca5f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.015 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[750700a1-6de8-4439-b210-a09396e1c933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.016 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc6ce614-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.018 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 kernel: tapfc6ce614-d0: left promiscuous mode
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.033 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.038 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f4495e-d718-4412-b0b3-52d64bbbaaa2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.054 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c465009-88ac-4c68-900d-bff315dc0917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.055 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[230b68ef-1662-4dd2-9599-25c80f50a827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.071 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[269156e2-a2eb-4bc4-b8f8-aaf8b6f31eab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382975, 'reachable_time': 40482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229128, 'error': None, 'target': 'ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.073 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc6ce614-d0f7-413f-bc3e-26f7271993d9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:07:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:07:57.074 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce07fff-8778-4b2f-9519-6c449419f01c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:07:57 np0005546909 systemd[1]: run-netns-ovnmeta\x2dfc6ce614\x2dd0f7\x2d413f\x2dbc3e\x2d26f7271993d9.mount: Deactivated successfully.
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.447 187212 DEBUG nova.virt.libvirt.vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:06:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-795100487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-795100487',id=62,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHKhL003clvQeWhyQnRnlaccZLUvEBLEhvImBOCB5geqDizgWJsGjayma/8q9qGL/NiGPTPxEZoxanWZnFRBuZklxJy5hDaSwVjbF4FtdnX9ysLeFgNsQAX0H4LK24ei2Q==',key_name='tempest-keypair-105541899',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d62df5807554f499d26b5fc77ec8603',ramdisk_id='',reservation_id='r-zgvbze4r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-1858452545',owner_user_name='tempest-AttachVolumeShelveTestJSON-1858452545-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='bc4332be3b424a5e996b61b244505cfc',uuid=5d70ac2d-111f-4e1b-ac26-3e02849b0458,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.448 187212 DEBUG nova.network.os_vif_util [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converting VIF {"id": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "address": "fa:16:3e:6a:c5:99", "network": {"id": "fc6ce614-d0f7-413f-bc3e-26f7271993d9", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-756676969-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d62df5807554f499d26b5fc77ec8603", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac02dd63-5a", "ovs_interfaceid": "ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.448 187212 DEBUG nova.network.os_vif_util [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.449 187212 DEBUG os_vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.451 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac02dd63-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.452 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.455 187212 INFO os_vif [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:c5:99,bridge_name='br-int',has_traffic_filtering=True,id=ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b,network=Network(fc6ce614-d0f7-413f-bc3e-26f7271993d9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac02dd63-5a')
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.456 187212 INFO nova.virt.libvirt.driver [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deleting instance files /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.462 187212 INFO nova.virt.libvirt.driver [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deletion of /var/lib/nova/instances/5d70ac2d-111f-4e1b-ac26-3e02849b0458_del complete
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.608 187212 INFO nova.compute.manager [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 0.89 seconds to destroy the instance on the hypervisor.
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.609 187212 DEBUG oslo.service.loopingcall [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.609 187212 DEBUG nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.610 187212 DEBUG nova.network.neutron [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.828 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Successfully created port: b66066cc-97eb-4896-a98d-267498dedf74 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:07:57 np0005546909 nova_compute[187208]: 2025-12-05 12:07:57.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:07:58 np0005546909 nova_compute[187208]: 2025-12-05 12:07:58.896 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:58 np0005546909 nova_compute[187208]: 2025-12-05 12:07:58.897 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.019 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.136 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.137 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.147 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.147 187212 INFO nova.compute.claims [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:07:59 np0005546909 podman[229130]: 2025-12-05 12:07:59.221634499 +0000 UTC m=+0.067818709 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:07:59 np0005546909 podman[229129]: 2025-12-05 12:07:59.239873388 +0000 UTC m=+0.080534568 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, io.openshift.tags=minimal rhel9)
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.465 187212 DEBUG nova.compute.provider_tree [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.566 187212 DEBUG nova.scheduler.client.report [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.593 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.456s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.594 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.705 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.706 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.752 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.773 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.885 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.886 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.887 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating image(s)
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.888 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.888 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.889 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.904 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.972 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.973 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.974 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:07:59 np0005546909 nova_compute[187208]: 2025-12-05 12:07:59.987 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.079 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.081 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.120 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.121 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.122 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.190 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.191 187212 DEBUG nova.virt.disk.api [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Checking if we can resize image /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.191 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.253 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.254 187212 DEBUG nova.virt.disk.api [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Cannot resize image /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.255 187212 DEBUG nova.objects.instance [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.340 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.341 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Ensure instance console log exists: /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.342 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.342 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.343 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:00 np0005546909 nova_compute[187208]: 2025-12-05 12:08:00.584 187212 DEBUG nova.policy [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.589 187212 DEBUG nova.network.neutron [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.610 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Took 4.00 seconds to deallocate network for instance.
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.666 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.666 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.880 187212 DEBUG nova.compute.provider_tree [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.903 187212 DEBUG nova.scheduler.client.report [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.954 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:01 np0005546909 nova_compute[187208]: 2025-12-05 12:08:01.982 187212 INFO nova.scheduler.client.report [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Deleted allocations for instance 5d70ac2d-111f-4e1b-ac26-3e02849b0458
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.044 187212 DEBUG oslo_concurrency.lockutils [None req-7a7d1f22-e47a-4213-8cea-2590bb2ccadc bc4332be3b424a5e996b61b244505cfc 6d62df5807554f499d26b5fc77ec8603 - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.725 187212 DEBUG nova.compute.manager [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.725 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG oslo_concurrency.lockutils [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.726 187212 DEBUG nova.compute.manager [req-92e7d2e2-ca41-423e-8554-7ea308e70d55 req-bed92bef-567f-431f-8199-5f2e11e3eddb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Processing event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.727 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance event wait completed in 16 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.731 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936482.7313843, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.732 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.736 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.741 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance spawned successfully.#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.742 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.758 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.766 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.771 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.772 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.772 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.773 187212 DEBUG nova.virt.libvirt.driver [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.797 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.812 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Successfully updated port: b66066cc-97eb-4896-a98d-267498dedf74 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.839 187212 INFO nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Took 25.94 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.839 187212 DEBUG nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquired lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.840 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.891 187212 DEBUG nova.compute.manager [req-0d9dddb7-eb24-43b4-93c9-b78e96c8b927 req-86e7b140-5813-423f-a933-72747bc196fa 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-deleted-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.905 187212 INFO nova.compute.manager [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Took 26.73 seconds to build instance.#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:02 np0005546909 nova_compute[187208]: 2025-12-05 12:08:02.923 187212 DEBUG oslo_concurrency.lockutils [None req-9c2206d1-052d-4789-ab40-647ab030b692 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 26.845s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.014 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:04 np0005546909 nova_compute[187208]: 2025-12-05 12:08:04.116 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:08:05 np0005546909 podman[229187]: 2025-12-05 12:08:05.223548315 +0000 UTC m=+0.076440289 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:08:05 np0005546909 podman[229188]: 2025-12-05 12:08:05.232874276 +0000 UTC m=+0.082662110 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:08:05 np0005546909 nova_compute[187208]: 2025-12-05 12:08:05.304 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936470.3032813, 39a36503-acd4-4199-89f3-2e714ef9e5c5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:05 np0005546909 nova_compute[187208]: 2025-12-05 12:08:05.305 187212 INFO nova.compute.manager [-] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:05 np0005546909 nova_compute[187208]: 2025-12-05 12:08:05.627 187212 DEBUG nova.compute.manager [None req-30ab2256-17eb-492e-958c-9ff045abc5f6 - - - - - -] [instance: 39a36503-acd4-4199-89f3-2e714ef9e5c5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:06 np0005546909 nova_compute[187208]: 2025-12-05 12:08:06.057 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Successfully created port: 11c7fa90-6a48-487a-a375-5adf7f41cb90 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.197 187212 DEBUG nova.network.neutron [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.218 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Releasing lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.219 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance network_info: |[{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.222 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start _get_guest_xml network_info=[{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.226 187212 WARNING nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.235 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.236 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.241 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.libvirt.host [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.242 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.266 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.267 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.268 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.268 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.269 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.269 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.270 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.270 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.271 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.271 187212 DEBUG nova.virt.hardware [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.284 187212 DEBUG nova.virt.libvirt.vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599576',owner
_user_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:54Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.285 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.286 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.287 187212 DEBUG nova.objects.instance [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.303 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <uuid>3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</uuid>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <name>instance-00000045</name>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1497746963</nova:name>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:08:07</nova:creationTime>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:user uuid="21ddc7a76417447daa2a5a26cdf17d53">tempest-ServerAddressesNegativeTestJSON-717599576-project-member</nova:user>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:project uuid="feb2d7c8b49945a08355fc4f902f2786">tempest-ServerAddressesNegativeTestJSON-717599576</nova:project>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        <nova:port uuid="b66066cc-97eb-4896-a98d-267498dedf74">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="serial">3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="uuid">3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:b2:b8:fe"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <target dev="tapb66066cc-97"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/console.log" append="off"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:08:07 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:08:07 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:08:07 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:08:07 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.303 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Preparing to wait for external event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.304 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.305 187212 DEBUG nova.virt.libvirt.vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599
576',owner_user_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:54Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.305 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.306 187212 DEBUG nova.network.os_vif_util [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.306 187212 DEBUG os_vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.307 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.307 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.308 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.312 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb66066cc-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.312 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb66066cc-97, col_values=(('external_ids', {'iface-id': 'b66066cc-97eb-4896-a98d-267498dedf74', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:b8:fe', 'vm-uuid': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:07 np0005546909 NetworkManager[55691]: <info>  [1764936487.3150] manager: (tapb66066cc-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.319 187212 INFO os_vif [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97')#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.384 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.385 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.385 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] No VIF found with MAC fa:16:3e:b2:b8:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.386 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Using config drive#033[00m
Dec  5 07:08:07 np0005546909 nova_compute[187208]: 2025-12-05 12:08:07.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.110 187212 INFO nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Creating config drive at /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.114 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmh473oo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.191 187212 DEBUG nova.compute.manager [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-changed-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG nova.compute.manager [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Refreshing instance network info cache due to event network-changed-b66066cc-97eb-4896-a98d-267498dedf74. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.192 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Refreshing network info cache for port b66066cc-97eb-4896-a98d-267498dedf74 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.197 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.198 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] No waiting events found dispatching network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 WARNING nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received unexpected event network-vif-plugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.199 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 DEBUG oslo_concurrency.lockutils [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5d70ac2d-111f-4e1b-ac26-3e02849b0458-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 DEBUG nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] No waiting events found dispatching network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.200 187212 WARNING nova.compute.manager [req-425e5c81-6ad8-4d80-9b23-d559c15b0249 req-1a42813a-7abb-445f-80f9-705d663b81d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Received unexpected event network-vif-plugged-ac02dd63-5a73-4f80-b0ac-b8abd6ab2b9b for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.242 187212 DEBUG oslo_concurrency.processutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxmh473oo" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.292 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.293 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.293 187212 DEBUG nova.objects.instance [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:08 np0005546909 kernel: tapb66066cc-97: entered promiscuous mode
Dec  5 07:08:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:08Z|00591|binding|INFO|Claiming lport b66066cc-97eb-4896-a98d-267498dedf74 for this chassis.
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.3295] manager: (tapb66066cc-97): new Tun device (/org/freedesktop/NetworkManager/Devices/242)
Dec  5 07:08:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:08Z|00592|binding|INFO|b66066cc-97eb-4896-a98d-267498dedf74: Claiming fa:16:3e:b2:b8:fe 10.100.0.8
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.335 187212 DEBUG nova.objects.instance [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.337 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b8:fe 10.100.0.8'], port_security=['fa:16:3e:b2:b8:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fdf3db2a-0067-4a50-8487-b97fc3fdd122', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8cd27338-7640-4d03-958e-44ccc0e8c5fb, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b66066cc-97eb-4896-a98d-267498dedf74) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.339 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b66066cc-97eb-4896-a98d-267498dedf74 in datapath ba5c1b46-c606-429f-b268-8a88a7b3641a bound to our chassis#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.341 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ba5c1b46-c606-429f-b268-8a88a7b3641a#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.336 187212 INFO nova.compute.manager [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Pausing#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.337 187212 DEBUG nova.objects.instance [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'flavor' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.348 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:08Z|00593|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 ovn-installed in OVS
Dec  5 07:08:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:08Z|00594|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 up in Southbound
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.351 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.355 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c7810d5c-3258-4ebe-bf3b-2a2c3958814c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.356 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapba5c1b46-c1 in ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.358 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapba5c1b46-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.358 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[86f03fa9-97cd-42b1-b3c3-64de9a93bdc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.361 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[970eeed6-87ac-4a6c-b184-93d4d687f13e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.371 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:08:08 np0005546909 systemd-udevd[229265]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.378 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[1007c90a-5216-4ff0-a488-31fe20b42d15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.3883] device (tapb66066cc-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.3891] device (tapb66066cc-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.391 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.3902311, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.391 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.393 187212 DEBUG nova.compute.manager [None req-adb88aac-3418-476b-a369-a74b1f64653d 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a66f6823-90dc-487d-8b2f-129ffaeb4844]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 systemd-machined[153543]: New machine qemu-76-instance-00000045.
Dec  5 07:08:08 np0005546909 systemd[1]: Started Virtual Machine qemu-76-instance-00000045.
Dec  5 07:08:08 np0005546909 podman[229246]: 2025-12-05 12:08:08.40966295 +0000 UTC m=+0.091447544 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.423 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[149a93e7-0508-46d8-ba0d-c6eee761a83d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.427 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.428 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9a60589f-64b1-4415-a01e-cbdb35d7299f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.4297] manager: (tapba5c1b46-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Dec  5 07:08:08 np0005546909 systemd-udevd[229274]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.463 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1974fd-1403-46fd-b19f-7afcccf0551b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.469 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cf506f8a-b53e-437b-b0cd-947cac4421af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.485 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.4926] device (tapba5c1b46-c0): carrier: link connected
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.497 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f65f2c3-4fb6-46e5-8a31-627bcd39ab54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.513 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[911e12a8-c5c6-4f8e-b04b-ee94bb86df4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba5c1b46-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:7b:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386905, 'reachable_time': 30980, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229305, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.533 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7c01e9-32f2-4516-97b0-4dbe360a4e4d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:7ba4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386905, 'tstamp': 386905}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229306, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.551 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f02c5b-3c79-46db-aa15-331774b9e78f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapba5c1b46-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:7b:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386905, 'reachable_time': 30980, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229307, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.582 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff62f8e8-6314-4909-b1f9-e00c0a1c67a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.638 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e50d870-4c75-458b-9ad3-d74578bd6a98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.640 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba5c1b46-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.641 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.642 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba5c1b46-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:08 np0005546909 kernel: tapba5c1b46-c0: entered promiscuous mode
Dec  5 07:08:08 np0005546909 NetworkManager[55691]: <info>  [1764936488.6447] manager: (tapba5c1b46-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.644 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.650 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapba5c1b46-c0, col_values=(('external_ids', {'iface-id': 'cf881e66-1434-41ee-aff2-459b4b74bf50'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.652 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:08Z|00595|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.657 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.658 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[97efd1a4-9fbc-49e2-a9e5-7e5204cd9803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.659 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-ba5c1b46-c606-429f-b268-8a88a7b3641a
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/ba5c1b46-c606-429f-b268-8a88a7b3641a.pid.haproxy
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID ba5c1b46-c606-429f-b268-8a88a7b3641a
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:08:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:08.662 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'env', 'PROCESS_TAG=haproxy-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ba5c1b46-c606-429f-b268-8a88a7b3641a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.667 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.726 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.7263088, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.727 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Started (Lifecycle Event)#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.747 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.751 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936488.7264965, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.751 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.773 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:08 np0005546909 nova_compute[187208]: 2025-12-05 12:08:08.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:09 np0005546909 podman[229346]: 2025-12-05 12:08:09.052256248 +0000 UTC m=+0.027606983 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:08:09 np0005546909 podman[229346]: 2025-12-05 12:08:09.457454736 +0000 UTC m=+0.432805461 container create a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:08:09 np0005546909 systemd[1]: Started libpod-conmon-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope.
Dec  5 07:08:09 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:08:09 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4532b7759c030458be51c068284cef48e94b6377a17ce91486e109cbe6a7f64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:08:09 np0005546909 podman[229346]: 2025-12-05 12:08:09.567670694 +0000 UTC m=+0.543021439 container init a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  5 07:08:09 np0005546909 podman[229346]: 2025-12-05 12:08:09.573210445 +0000 UTC m=+0.548561170 container start a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:09 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : New worker (229367) forked
Dec  5 07:08:09 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : Loading success.
Dec  5 07:08:09 np0005546909 nova_compute[187208]: 2025-12-05 12:08:09.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:09 np0005546909 nova_compute[187208]: 2025-12-05 12:08:09.939 187212 DEBUG nova.policy [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:08:10 np0005546909 nova_compute[187208]: 2025-12-05 12:08:10.511 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Successfully updated port: 11c7fa90-6a48-487a-a375-5adf7f41cb90 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:08:10 np0005546909 nova_compute[187208]: 2025-12-05 12:08:10.527 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:10 np0005546909 nova_compute[187208]: 2025-12-05 12:08:10.527 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:10 np0005546909 nova_compute[187208]: 2025-12-05 12:08:10.528 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:10 np0005546909 nova_compute[187208]: 2025-12-05 12:08:10.826 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:08:11 np0005546909 nova_compute[187208]: 2025-12-05 12:08:11.975 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updated VIF entry in instance network info cache for port b66066cc-97eb-4896-a98d-267498dedf74. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:08:11 np0005546909 nova_compute[187208]: 2025-12-05 12:08:11.975 187212 DEBUG nova.network.neutron [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [{"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:11 np0005546909 nova_compute[187208]: 2025-12-05 12:08:11.980 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936476.9795291, 5d70ac2d-111f-4e1b-ac26-3e02849b0458 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:11 np0005546909 nova_compute[187208]: 2025-12-05 12:08:11.981 187212 INFO nova.compute.manager [-] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.003 187212 DEBUG nova.compute.manager [None req-4254625d-f9a2-4c87-898a-10f0443cfca9 - - - - - -] [instance: 5d70ac2d-111f-4e1b-ac26-3e02849b0458] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.004 187212 DEBUG oslo_concurrency.lockutils [req-1116509f-da6e-4113-b13c-c024312f0680 req-597fa8a5-6589-4152-ab37-f4f126b970ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.211 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: d35fce09-856e-4ebf-b944-0c0953a9492b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.395 187212 DEBUG nova.network.neutron [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.418 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.419 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance network_info: |[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.421 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start _get_guest_xml network_info=[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.426 187212 WARNING nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.433 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.477 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.482 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.483 187212 DEBUG nova.virt.libvirt.host [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.483 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.484 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.484 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.485 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.486 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.487 187212 DEBUG nova.virt.hardware [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.492 187212 DEBUG nova.virt.libvirt.vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:59Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.492 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.493 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.494 187212 DEBUG nova.objects.instance [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.506 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <uuid>2e537618-f998-4c4d-8e1e-e9cc79219330</uuid>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <name>instance-00000046</name>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1436335913</nova:name>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:08:12</nova:creationTime>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:user uuid="6a2cefdbcaae4db3b3ece95c8227d77e">tempest-ServerRescueTestJSONUnderV235-1035500959-project-member</nova:user>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:project uuid="e846fccb774e44f585d8847897bc4229">tempest-ServerRescueTestJSONUnderV235-1035500959</nova:project>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        <nova:port uuid="11c7fa90-6a48-487a-a375-5adf7f41cb90">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="serial">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="uuid">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:e4:ee:e4"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <target dev="tap11c7fa90-6a"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log" append="off"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:08:12 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:08:12 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:08:12 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:08:12 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Preparing to wait for external event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.508 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.509 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.509 187212 DEBUG nova.virt.libvirt.vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:07:59Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.510 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.510 187212 DEBUG nova.network.os_vif_util [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.511 187212 DEBUG os_vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.511 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.512 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.515 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11c7fa90-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.515 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11c7fa90-6a, col_values=(('external_ids', {'iface-id': '11c7fa90-6a48-487a-a375-5adf7f41cb90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:ee:e4', 'vm-uuid': '2e537618-f998-4c4d-8e1e-e9cc79219330'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:12 np0005546909 NetworkManager[55691]: <info>  [1764936492.5175] manager: (tap11c7fa90-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.524 187212 INFO os_vif [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a')#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.572 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.573 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.573 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No VIF found with MAC fa:16:3e:e4:ee:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.574 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Using config drive#033[00m
Dec  5 07:08:12 np0005546909 nova_compute[187208]: 2025-12-05 12:08:12.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.295 187212 INFO nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating config drive at /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.301 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprmqzuqi_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00596|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00597|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00598|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00599|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.433 187212 DEBUG oslo_concurrency.processutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprmqzuqi_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:13 np0005546909 kernel: tap11c7fa90-6a: entered promiscuous mode
Dec  5 07:08:13 np0005546909 NetworkManager[55691]: <info>  [1764936493.4975] manager: (tap11c7fa90-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.498 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00600|binding|INFO|Claiming lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 for this chassis.
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00601|binding|INFO|11c7fa90-6a48-487a-a375-5adf7f41cb90: Claiming fa:16:3e:e4:ee:e4 10.100.0.2
Dec  5 07:08:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.505 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.507 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 bound to our chassis#033[00m
Dec  5 07:08:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.509 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec  5 07:08:13 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:13.510 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34af6233-215e-4e65-b3f5-ea644449d93e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00602|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 up in Southbound
Dec  5 07:08:13 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:13Z|00603|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 ovn-installed in OVS
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.513 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:13 np0005546909 systemd-udevd[229395]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:13 np0005546909 systemd-machined[153543]: New machine qemu-77-instance-00000046.
Dec  5 07:08:13 np0005546909 NetworkManager[55691]: <info>  [1764936493.5438] device (tap11c7fa90-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:13 np0005546909 NetworkManager[55691]: <info>  [1764936493.5448] device (tap11c7fa90-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:13 np0005546909 systemd[1]: Started Virtual Machine qemu-77-instance-00000046.
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.781 187212 DEBUG nova.compute.manager [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.781 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG oslo_concurrency.lockutils [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.782 187212 DEBUG nova.compute.manager [req-d9ef96de-c639-46d3-9eb7-212aa494b726 req-fb8ea353-8080-4b5b-8737-24367aaa8468 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Processing event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.783 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.789 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936493.788678, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.789 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.792 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.796 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance spawned successfully.#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.797 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.808 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.821 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.825 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.826 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.826 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.827 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.827 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.828 187212 DEBUG nova.virt.libvirt.driver [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.849 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.885 187212 INFO nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 19.58 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.886 187212 DEBUG nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:13 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.978 187212 INFO nova.compute.manager [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 20.29 seconds to build instance.#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:13.999 187212 DEBUG oslo_concurrency.lockutils [None req-79257b89-8901-4451-bb7d-fb6eae079caf 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.289 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: d35fce09-856e-4ebf-b944-0c0953a9492b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.303 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.304 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.304 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.375 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936494.3747003, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.376 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Started (Lifecycle Event)#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.393 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.396 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936494.3748746, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.411 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.414 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.431 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:14 np0005546909 nova_compute[187208]: 2025-12-05 12:08:14.604 187212 WARNING nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:15 np0005546909 podman[229412]: 2025-12-05 12:08:15.208977336 +0000 UTC m=+0.058653414 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.698 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.699 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.699 187212 INFO nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Shelving#033[00m
Dec  5 07:08:16 np0005546909 kernel: tap29e412e9-d3 (unregistering): left promiscuous mode
Dec  5 07:08:16 np0005546909 NetworkManager[55691]: <info>  [1764936496.7452] device (tap29e412e9-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00604|binding|INFO|Releasing lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 from this chassis (sb_readonly=0)
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00605|binding|INFO|Setting lport 29e412e9-d3cc-4af2-b85a-ab48fcad0372 down in Southbound
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00606|binding|INFO|Removing iface tap29e412e9-d3 ovn-installed in OVS
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.770 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:32:38 10.100.0.6'], port_security=['fa:16:3e:68:32:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5659bd52-8c24-483d-80a4-8eb6b28e1349', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=29e412e9-d3cc-4af2-b85a-ab48fcad0372) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.771 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 29e412e9-d3cc-4af2-b85a-ab48fcad0372 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.774 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.790 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d964fd-084c-4616-8cb6-e3be4ea26f7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000044.scope: Deactivated successfully.
Dec  5 07:08:16 np0005546909 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d00000044.scope: Consumed 6.328s CPU time.
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.821 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.821 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:16 np0005546909 systemd-machined[153543]: Machine qemu-75-instance-00000044 terminated.
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.825 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b803acea-5d7d-4afd-ab13-5561f7053a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.828 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5428dbc5-556e-47d5-b622-df5a7365febe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.838 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.849 187212 DEBUG nova.network.neutron [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.859 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[12cc9e66-1e7c-4f8d-855b-dcf9d2a6f9ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fa992cce-ae68-43c4-8f77-06d49adaa9e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 22378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229449, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.884 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.889 187212 DEBUG nova.virt.libvirt.vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.889 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.890 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.891 187212 DEBUG os_vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.893 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.893 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.896 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9cab09-8277-4c58-8b38-6e1571a20e1f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229450, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229450, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.898 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.904 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd35fce09-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.904 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd35fce09-85, col_values=(('external_ids', {'iface-id': 'd35fce09-856e-4ebf-b944-0c0953a9492b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:01:47', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 NetworkManager[55691]: <info>  [1764936496.9071] manager: (tapd35fce09-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.909 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.909 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.910 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.910 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.915 187212 INFO os_vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85')#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.916 187212 DEBUG nova.virt.libvirt.vif [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.917 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.918 187212 DEBUG nova.network.os_vif_util [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.921 187212 DEBUG nova.virt.libvirt.guest [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec  5 07:08:16 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:b8:01:47"/>
Dec  5 07:08:16 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:08:16 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:16 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:08:16 np0005546909 nova_compute[187208]:  <target dev="tapd35fce09-85"/>
Dec  5 07:08:16 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:08:16 np0005546909 nova_compute[187208]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.934 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.935 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:16 np0005546909 systemd-udevd[229441]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:16 np0005546909 NetworkManager[55691]: <info>  [1764936496.9408] manager: (tapd35fce09-85): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Dec  5 07:08:16 np0005546909 kernel: tapd35fce09-85: entered promiscuous mode
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00607|binding|INFO|Claiming lport d35fce09-856e-4ebf-b944-0c0953a9492b for this chassis.
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00608|binding|INFO|d35fce09-856e-4ebf-b944-0c0953a9492b: Claiming fa:16:3e:b8:01:47 10.100.0.3
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.954 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.955 187212 INFO nova.compute.claims [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:08:16 np0005546909 NetworkManager[55691]: <info>  [1764936496.9588] device (tapd35fce09-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:16 np0005546909 NetworkManager[55691]: <info>  [1764936496.9601] device (tapd35fce09-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.954 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:01:47 10.100.0.3'], port_security=['fa:16:3e:b8:01:47 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d35fce09-856e-4ebf-b944-0c0953a9492b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.955 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d35fce09-856e-4ebf-b944-0c0953a9492b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.958 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00609|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b ovn-installed in OVS
Dec  5 07:08:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:16Z|00610|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b up in Southbound
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.972 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:16 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:16.975 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c63a37e-de31-4db2-b877-c3dd918b8440]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:16 np0005546909 nova_compute[187208]: 2025-12-05 12:08:16.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.005 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.006 187212 DEBUG nova.objects.instance [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.008 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[07046d03-7d85-4006-8cc5-5917f0ec47fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.011 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a4f46fef-fed7-45ef-8aed-c0020aec7a7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.038 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c8ee53-da90-4abf-8c23-86149699d4f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.042 187212 DEBUG nova.virt.libvirt.driver [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.056 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2d20c361-8c6c-4a87-a7a3-48f0df83e14f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229484, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.066 187212 DEBUG nova.virt.libvirt.guest [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:08:17</nova:creationTime>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:08:17 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec  5 07:08:17 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:17 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:08:17 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:08:17 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.072 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[706e7789-f569-464e-b29e-e7bfbfe80b88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229485, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229485, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.073 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.075 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.079 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.079 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.080 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:17.080 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.095 187212 DEBUG oslo_concurrency.lockutils [None req-cbb0f4f9-6505-4620-89f6-25eefcb100fd 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.188 187212 DEBUG nova.compute.provider_tree [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.204 187212 DEBUG nova.scheduler.client.report [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.232 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.232 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.289 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.290 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.316 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.336 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.353 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Beginning cold snapshot process#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.446 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.447 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.447 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating image(s)#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.448 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.463 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.507 187212 DEBUG nova.privsep.utils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.507 187212 DEBUG oslo_concurrency.processutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk /var/lib/nova/instances/snapshots/tmpwfyho5o3/f6ab3485ca4a4553bc3b5ba4601e8af8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.526 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.527 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.528 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.543 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.620 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.621 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.690 187212 DEBUG oslo_concurrency.processutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349/disk /var/lib/nova/instances/snapshots/tmpwfyho5o3/f6ab3485ca4a4553bc3b5ba4601e8af8" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.691 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.695 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk 1073741824" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.697 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.697 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.770 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.771 187212 DEBUG nova.virt.disk.api [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Checking if we can resize image /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.772 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.795 187212 DEBUG nova.policy [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.829 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.830 187212 DEBUG nova.virt.disk.api [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Cannot resize image /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.830 187212 DEBUG nova.objects.instance [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'migration_context' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.843 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.844 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Ensure instance console log exists: /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.844 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.845 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.845 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:17 np0005546909 nova_compute[187208]: 2025-12-05 12:08:17.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.616 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.617 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:08:18 np0005546909 nova_compute[187208]: 2025-12-05 12:08:18.619 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Successfully created port: df4eecd2-b2e2-445a-acac-232f66123555 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec  5 07:08:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:19Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:01:47 10.100.0.3
Dec  5 07:08:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:19Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:01:47 10.100.0.3
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.509 187212 DEBUG nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received event network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG oslo_concurrency.lockutils [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.510 187212 DEBUG nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] No waiting events found dispatching network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.511 187212 WARNING nova.compute.manager [req-90bf95d1-8c79-4a87-b1cf-3c527793d704 req-7704b739-a146-4b45-b2e5-6944a4a39144 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Received unexpected event network-vif-unplugged-29e412e9-d3cc-4af2-b85a-ab48fcad0372 for instance with vm_state paused and task_state shelving_image_uploading.
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.803 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Snapshot image upload complete
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.804 187212 DEBUG nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.858 187212 INFO nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Shelve offloading
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.865 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.865 187212 DEBUG nova.compute.manager [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.868 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.868 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:08:19 np0005546909 nova_compute[187208]: 2025-12-05 12:08:19.869 187212 DEBUG nova.network.neutron [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.043 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Successfully updated port: df4eecd2-b2e2-445a-acac-232f66123555 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquired lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.061 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.063 187212 DEBUG nova.objects.instance [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.310 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.346 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.347 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.379 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.380 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 WARNING nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state active and task_state None.
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.381 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Processing event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.382 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.383 187212 WARNING nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state building and task_state spawning.
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.compute.manager [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-d35fce09-856e-4ebf-b944-0c0953a9492b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.384 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port d35fce09-856e-4ebf-b944-0c0953a9492b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.393 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.397 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936500.3971863, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Resumed (Lifecycle Event)
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.399 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.402 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance spawned successfully.
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.403 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.429 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.434 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.439 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.440 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.440 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.441 187212 DEBUG nova.virt.libvirt.driver [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.471 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.508 187212 INFO nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 20.62 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.509 187212 DEBUG nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.575 187212 INFO nova.compute.manager [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 21.48 seconds to build instance.#033[00m
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.595 187212 DEBUG oslo_concurrency.lockutils [None req-6ae30eea-39c4-4119-a479-e459e4fd209e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.714 187212 DEBUG nova.objects.instance [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:20 np0005546909 nova_compute[187208]: 2025-12-05 12:08:20.736 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.106 187212 DEBUG nova.policy [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.712 187212 DEBUG nova.network.neutron [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.740 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Releasing lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.740 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance network_info: |[{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.742 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start _get_guest_xml network_info=[{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.747 187212 WARNING nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.753 187212 DEBUG nova.network.neutron [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Updating instance_info_cache with network_info: [{"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.754 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.755 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.host [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.758 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.759 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.760 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.761 187212 DEBUG nova.virt.hardware [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.765 187212 DEBUG nova.virt.libvirt.vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_na
me='tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.765 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.766 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.767 187212 DEBUG nova.objects.instance [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'pci_devices' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.769 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-5659bd52-8c24-483d-80a4-8eb6b28e1349" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.780 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <uuid>b235a96f-7a12-4bd2-8627-33b128346aa4</uuid>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <name>instance-00000047</name>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-959694714</nova:name>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:08:21</nova:creationTime>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:user uuid="132d581de02e49b9a4c99b9b831dd5b5">tempest-ServerMetadataNegativeTestJSON-91345283-project-member</nova:user>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:project uuid="bf30ed1956544c7eae67c989042126e4">tempest-ServerMetadataNegativeTestJSON-91345283</nova:project>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        <nova:port uuid="df4eecd2-b2e2-445a-acac-232f66123555">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="serial">b235a96f-7a12-4bd2-8627-33b128346aa4</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="uuid">b235a96f-7a12-4bd2-8627-33b128346aa4</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:40:3b:49"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <target dev="tapdf4eecd2-b2"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/console.log" append="off"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:08:21 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:08:21 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:08:21 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:08:21 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Preparing to wait for external event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.781 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.782 187212 DEBUG nova.virt.libvirt.vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_name='tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.782 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.783 187212 DEBUG nova.network.os_vif_util [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.783 187212 DEBUG os_vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.784 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf4eecd2-b2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.788 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf4eecd2-b2, col_values=(('external_ids', {'iface-id': 'df4eecd2-b2e2-445a-acac-232f66123555', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:3b:49', 'vm-uuid': 'b235a96f-7a12-4bd2-8627-33b128346aa4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:21 np0005546909 NetworkManager[55691]: <info>  [1764936501.7915] manager: (tapdf4eecd2-b2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.798 187212 INFO os_vif [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2')#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.850 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.850 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.851 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] No VIF found with MAC fa:16:3e:40:3b:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:21 np0005546909 nova_compute[187208]: 2025-12-05 12:08:21.852 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Using config drive#033[00m
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00611|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00612|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00613|binding|INFO|Releasing lport cf881e66-1434-41ee-aff2-459b4b74bf50 from this chassis (sb_readonly=0)
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00614|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.390 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.483 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.483 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.484 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.485 187212 INFO nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Terminating instance#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.486 187212 DEBUG nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:08:22 np0005546909 kernel: tapb66066cc-97 (unregistering): left promiscuous mode
Dec  5 07:08:22 np0005546909 NetworkManager[55691]: <info>  [1764936502.5138] device (tapb66066cc-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00615|binding|INFO|Releasing lport b66066cc-97eb-4896-a98d-267498dedf74 from this chassis (sb_readonly=0)
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00616|binding|INFO|Setting lport b66066cc-97eb-4896-a98d-267498dedf74 down in Southbound
Dec  5 07:08:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:22Z|00617|binding|INFO|Removing iface tapb66066cc-97 ovn-installed in OVS
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.588 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b8:fe 10.100.0.8'], port_security=['fa:16:3e:b2:b8:fe 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feb2d7c8b49945a08355fc4f902f2786', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fdf3db2a-0067-4a50-8487-b97fc3fdd122', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8cd27338-7640-4d03-958e-44ccc0e8c5fb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b66066cc-97eb-4896-a98d-267498dedf74) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.589 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b66066cc-97eb-4896-a98d-267498dedf74 in datapath ba5c1b46-c606-429f-b268-8a88a7b3641a unbound from our chassis#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.589 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.591 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ba5c1b46-c606-429f-b268-8a88a7b3641a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:08:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.592 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a10c32-6f74-442c-a9b9-dad1f7ffe488]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:22.593 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a namespace which is not needed anymore#033[00m
Dec  5 07:08:22 np0005546909 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000045.scope: Deactivated successfully.
Dec  5 07:08:22 np0005546909 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d00000045.scope: Consumed 8.913s CPU time.
Dec  5 07:08:22 np0005546909 systemd-machined[153543]: Machine qemu-76-instance-00000045 terminated.
Dec  5 07:08:22 np0005546909 podman[229514]: 2025-12-05 12:08:22.673913196 +0000 UTC m=+0.099442277 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec  5 07:08:22 np0005546909 NetworkManager[55691]: <info>  [1764936502.7110] manager: (tapb66066cc-97): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Dec  5 07:08:22 np0005546909 systemd-udevd[229521]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.777 187212 INFO nova.virt.libvirt.driver [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Instance destroyed successfully.#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.778 187212 DEBUG nova.objects.instance [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lazy-loading 'resources' on Instance uuid 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.793 187212 DEBUG nova.virt.libvirt.vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1497746963',display_name='tempest-ServerAddressesNegativeTestJSON-server-1497746963',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1497746963',id=69,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='feb2d7c8b49945a08355fc4f902f2786',ramdisk_id='',reservation_id='r-wvd553zf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-717599576',owner_user_name='tempest-ServerAddressesNegativeTestJSON-717599576-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:13Z,user_data=None,user_id='21ddc7a76417447daa2a5a26cdf17d53',uuid=3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.793 187212 DEBUG nova.network.os_vif_util [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converting VIF {"id": "b66066cc-97eb-4896-a98d-267498dedf74", "address": "fa:16:3e:b2:b8:fe", "network": {"id": "ba5c1b46-c606-429f-b268-8a88a7b3641a", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1150968245-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "feb2d7c8b49945a08355fc4f902f2786", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb66066cc-97", "ovs_interfaceid": "b66066cc-97eb-4896-a98d-267498dedf74", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.794 187212 DEBUG nova.network.os_vif_util [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.794 187212 DEBUG os_vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.796 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.796 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb66066cc-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.809 187212 INFO os_vif [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:b8:fe,bridge_name='br-int',has_traffic_filtering=True,id=b66066cc-97eb-4896-a98d-267498dedf74,network=Network(ba5c1b46-c606-429f-b268-8a88a7b3641a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb66066cc-97')#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.810 187212 INFO nova.virt.libvirt.driver [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deleting instance files /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf_del#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.811 187212 INFO nova.virt.libvirt.driver [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deletion of /var/lib/nova/instances/3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf_del complete#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.856 187212 INFO nova.compute.manager [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.857 187212 DEBUG oslo.service.loopingcall [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.857 187212 DEBUG nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.858 187212 DEBUG nova.network.neutron [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.891 187212 INFO nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Creating config drive at /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.896 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2bfra8dh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.924 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully created port: af04237a-1f79-4f68-a18e-1ceb4911605b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:08:22 np0005546909 nova_compute[187208]: 2025-12-05 12:08:22.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:22 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : haproxy version is 2.8.14-c23fe91
Dec  5 07:08:22 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [NOTICE]   (229365) : path to executable is /usr/sbin/haproxy
Dec  5 07:08:22 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [WARNING]  (229365) : Exiting Master process...
Dec  5 07:08:22 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [ALERT]    (229365) : Current worker (229367) exited with code 143 (Terminated)
Dec  5 07:08:22 np0005546909 neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a[229361]: [WARNING]  (229365) : All workers exited. Exiting... (0)
Dec  5 07:08:22 np0005546909 systemd[1]: libpod-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope: Deactivated successfully.
Dec  5 07:08:22 np0005546909 podman[229557]: 2025-12-05 12:08:22.944444566 +0000 UTC m=+0.239881422 container died a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.028 187212 DEBUG oslo_concurrency.processutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2bfra8dh" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.087 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port d35fce09-856e-4ebf-b944-0c0953a9492b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.087 187212 DEBUG nova.network.neutron [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:23 np0005546909 kernel: tapdf4eecd2-b2: entered promiscuous mode
Dec  5 07:08:23 np0005546909 NetworkManager[55691]: <info>  [1764936503.0976] manager: (tapdf4eecd2-b2): new Tun device (/org/freedesktop/NetworkManager/Devices/251)
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.101 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:23Z|00618|binding|INFO|Claiming lport df4eecd2-b2e2-445a-acac-232f66123555 for this chassis.
Dec  5 07:08:23 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:23Z|00619|binding|INFO|df4eecd2-b2e2-445a-acac-232f66123555: Claiming fa:16:3e:40:3b:49 10.100.0.11
Dec  5 07:08:23 np0005546909 NetworkManager[55691]: <info>  [1764936503.1100] device (tapdf4eecd2-b2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:23 np0005546909 NetworkManager[55691]: <info>  [1764936503.1115] device (tapdf4eecd2-b2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:23 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:23Z|00620|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 ovn-installed in OVS
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 systemd-machined[153543]: New machine qemu-78-instance-00000047.
Dec  5 07:08:23 np0005546909 systemd[1]: Started Virtual Machine qemu-78-instance-00000047.
Dec  5 07:08:23 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:23Z|00621|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 up in Southbound
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.179 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:3b:49 10.100.0.11'], port_security=['fa:16:3e:40:3b:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf30ed1956544c7eae67c989042126e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c4ee2104-41f1-480e-ab3a-db882b9c2d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bb90128-3616-41a6-a999-156ce64fbcf7, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=df4eecd2-b2e2-445a-acac-232f66123555) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.202 187212 DEBUG oslo_concurrency.lockutils [req-5a740367-b8fb-448d-8b04-24ea94e1c3d7 req-9cb88eff-12c8-42d5-aee9-4c4134cbd493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:23 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5-userdata-shm.mount: Deactivated successfully.
Dec  5 07:08:23 np0005546909 systemd[1]: var-lib-containers-storage-overlay-c4532b7759c030458be51c068284cef48e94b6377a17ce91486e109cbe6a7f64-merged.mount: Deactivated successfully.
Dec  5 07:08:23 np0005546909 podman[229557]: 2025-12-05 12:08:23.332483297 +0000 UTC m=+0.627920143 container cleanup a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:08:23 np0005546909 systemd[1]: libpod-conmon-a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5.scope: Deactivated successfully.
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.541 187212 DEBUG nova.network.neutron [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.558 187212 INFO nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Took 0.70 seconds to deallocate network for instance.#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.600 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.600 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:23 np0005546909 podman[229629]: 2025-12-05 12:08:23.655398847 +0000 UTC m=+0.296014991 container remove a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.662 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d43d98ce-6f1a-4bc9-b456-621a19ac3e04]: (4, ('Fri Dec  5 12:08:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a (a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5)\na66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5\nFri Dec  5 12:08:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a (a66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5)\na66f9cd25c42ab1ba49898fadc1746edd658e5ad011664c64b4aa32a531f89d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.667 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[88ccd2b5-da90-4d5a-b525-264d03ac3fe9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.668 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba5c1b46-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 kernel: tapba5c1b46-c0: left promiscuous mode
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.721 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.729 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0be8750-06d0-4fdf-b697-3b5259fa6b0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.743 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f64e9cb-346b-4319-8f15-c291a8cfbbd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[254951da-7c25-4279-b2f5-e023d326aba6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.761 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[538f8118-7705-4c28-9355-720f91e8a10e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386898, 'reachable_time': 35191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229647, 'error': None, 'target': 'ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 systemd[1]: run-netns-ovnmeta\x2dba5c1b46\x2dc606\x2d429f\x2db268\x2d8a88a7b3641a.mount: Deactivated successfully.
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.766 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ba5c1b46-c606-429f-b268-8a88a7b3641a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.766 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a95a30ea-d5bf-435e-b58d-c18fb195322b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.767 104471 INFO neutron.agent.ovn.metadata.agent [-] Port df4eecd2-b2e2-445a-acac-232f66123555 in datapath 02d8cc87-efdf-4db2-b7ab-393e2480966a unbound from our chassis#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.770 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 02d8cc87-efdf-4db2-b7ab-393e2480966a#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.771 187212 DEBUG nova.compute.provider_tree [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.781 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cecd87c7-ed40-48ba-8440-0c66d3cc5949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.782 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap02d8cc87-e1 in ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.784 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap02d8cc87-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.784 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[91af8f47-5d2a-4a62-8afb-85ffbc7ed9b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.785 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfe9c1f-0bd3-44e9-aa77-86a35850a6a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.784 187212 DEBUG nova.scheduler.client.report [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.801 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5187c169-b2ea-4c8d-b692-1b92ce38ad4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.806 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.827 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f40dd680-b448-4af6-b5c1-a237e078ce68]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.829 187212 INFO nova.scheduler.client.report [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Deleted allocations for instance 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.856 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0780af-45c9-4d41-9552-2b13fda9c371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.861 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7baddfa4-e467-4d8d-864e-cfdc8649f2ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 NetworkManager[55691]: <info>  [1764936503.8633] manager: (tap02d8cc87-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/252)
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.894 187212 DEBUG oslo_concurrency.lockutils [None req-1775fb19-e376-4f65-8ce0-8fe2c0232f5f 21ddc7a76417447daa2a5a26cdf17d53 feb2d7c8b49945a08355fc4f902f2786 - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.895 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a454ff82-0af6-4ceb-a689-094a9f685687]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.900 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a63bb927-13e1-43ed-8441-f2e638e53575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 NetworkManager[55691]: <info>  [1764936503.9229] device (tap02d8cc87-e0): carrier: link connected
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.926 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[1624e3e2-c040-42e3-9055-5bd203a22b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.944 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0a0ae1-77c9-4f99-8a6b-4eb6ca20b572]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02d8cc87-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:79:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388448, 'reachable_time': 24812, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229683, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.961 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac907d3-af80-4a06-abdf-ec956fea284c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:795a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388448, 'tstamp': 388448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229685, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:23.976 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[36e27eb3-e62f-4a94-b27f-571766fba41b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap02d8cc87-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:79:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388448, 'reachable_time': 27146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229686, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.982 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936503.9821005, b235a96f-7a12-4bd2-8627-33b128346aa4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:23 np0005546909 nova_compute[187208]: 2025-12-05 12:08:23.983 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Started (Lifecycle Event)#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.004 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.009 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936503.9822052, b235a96f-7a12-4bd2-8627-33b128346aa4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.010 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.017 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[837b553a-49c7-4f9d-ba9e-c6494e537b50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.037 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.041 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.069 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8e6db9-b36f-4000-8b61-30ee798359ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.075 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d8cc87-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.075 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.076 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap02d8cc87-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.078 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 NetworkManager[55691]: <info>  [1764936504.0792] manager: (tap02d8cc87-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/253)
Dec  5 07:08:24 np0005546909 kernel: tap02d8cc87-e0: entered promiscuous mode
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.084 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.088 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap02d8cc87-e0, col_values=(('external_ids', {'iface-id': '0dffa729-6b55-4e58-afef-f1cdc22c22fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:24Z|00622|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.106 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.108 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.110 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb2f136-ff40-46b9-8250-07a167e8dbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.111 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-02d8cc87-efdf-4db2-b7ab-393e2480966a
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/02d8cc87-efdf-4db2-b7ab-393e2480966a.pid.haproxy
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 02d8cc87-efdf-4db2-b7ab-393e2480966a
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:08:24 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:24.112 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'env', 'PROCESS_TAG=haproxy-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/02d8cc87-efdf-4db2-b7ab-393e2480966a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.190 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.190 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 WARNING nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state None.#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.191 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.192 187212 WARNING nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state None.#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-changed-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.compute.manager [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Refreshing instance network info cache due to event network-changed-df4eecd2-b2e2-445a-acac-232f66123555. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.193 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Refreshing network info cache for port df4eecd2-b2e2-445a-acac-232f66123555 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.444 187212 INFO nova.virt.libvirt.driver [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Instance destroyed successfully.#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.445 187212 DEBUG nova.objects.instance [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 5659bd52-8c24-483d-80a4-8eb6b28e1349 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.459 187212 DEBUG nova.virt.libvirt.vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1539570170',display_name='tempest-ServerActionsTestOtherB-server-1539570170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1539570170',id=68,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-bfsh2n18',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:19.804531',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4f3e32d3-f28d-4124-97de-ec6d4f73bf1d'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:17Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=5659bd52-8c24-483d-80a4-8eb6b28e1349,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.460 187212 DEBUG nova.network.os_vif_util [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "address": "fa:16:3e:68:32:38", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29e412e9-d3", "ovs_interfaceid": "29e412e9-d3cc-4af2-b85a-ab48fcad0372", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.461 187212 DEBUG nova.network.os_vif_util [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.461 187212 DEBUG os_vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.464 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29e412e9-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.471 187212 INFO os_vif [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:32:38,bridge_name='br-int',has_traffic_filtering=True,id=29e412e9-d3cc-4af2-b85a-ab48fcad0372,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29e412e9-d3')#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.472 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Deleting instance files /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349_del#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.473 187212 INFO nova.virt.libvirt.driver [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Deletion of /var/lib/nova/instances/5659bd52-8c24-483d-80a4-8eb6b28e1349_del complete#033[00m
Dec  5 07:08:24 np0005546909 podman[229721]: 2025-12-05 12:08:24.539977045 +0000 UTC m=+0.066084348 container create 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.564 187212 INFO nova.scheduler.client.report [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 5659bd52-8c24-483d-80a4-8eb6b28e1349#033[00m
Dec  5 07:08:24 np0005546909 systemd[1]: Started libpod-conmon-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope.
Dec  5 07:08:24 np0005546909 podman[229721]: 2025-12-05 12:08:24.507478202 +0000 UTC m=+0.033585525 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.606 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.606 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:24 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:08:24 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3936c809ee6a8001892b1f5e8b230731bdba206d1aa032e18836cd92f8d64675/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:08:24 np0005546909 podman[229721]: 2025-12-05 12:08:24.633142979 +0000 UTC m=+0.159250302 container init 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:24 np0005546909 podman[229721]: 2025-12-05 12:08:24.638812103 +0000 UTC m=+0.164919406 container start 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:24 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : New worker (229741) forked
Dec  5 07:08:24 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : Loading success.
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.761 187212 DEBUG nova.compute.provider_tree [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.779 187212 DEBUG nova.scheduler.client.report [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.808 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:24 np0005546909 nova_compute[187208]: 2025-12-05 12:08:24.862 187212 DEBUG oslo_concurrency.lockutils [None req-a4c0ed9f-9c38-43ff-a047-d3ada4864002 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "5659bd52-8c24-483d-80a4-8eb6b28e1349" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 8.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.090 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: af04237a-1f79-4f68-a18e-1ceb4911605b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.104 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.105 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.105 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.381 187212 WARNING nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.382 187212 WARNING nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.739 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updated VIF entry in instance network info cache for port df4eecd2-b2e2-445a-acac-232f66123555. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.739 187212 DEBUG nova.network.neutron [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [{"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:25 np0005546909 nova_compute[187208]: 2025-12-05 12:08:25.760 187212 DEBUG oslo_concurrency.lockutils [req-47783d85-6e6b-4b2f-a74e-64e33a082b30 req-a6eef540-eb3f-4022-b4ea-a383e56d6185 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b235a96f-7a12-4bd2-8627-33b128346aa4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.565 187212 INFO nova.compute.manager [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Rescuing#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.566 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.566 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.567 187212 DEBUG nova.network.neutron [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.972 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 WARNING nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-unplugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.973 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] No waiting events found dispatching network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 WARNING nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received unexpected event network-vif-plugged-b66066cc-97eb-4896-a98d-267498dedf74 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG nova.compute.manager [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-af04237a-1f79-4f68-a18e-1ceb4911605b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:08:27 np0005546909 nova_compute[187208]: 2025-12-05 12:08:27.974 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.866 187212 DEBUG nova.network.neutron [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.928 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.929 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.929 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port af04237a-1f79-4f68-a18e-1ceb4911605b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.932 187212 DEBUG nova.virt.libvirt.vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.932 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.933 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.933 187212 DEBUG os_vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.934 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf04237a-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.937 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf04237a-1f, col_values=(('external_ids', {'iface-id': 'af04237a-1f79-4f68-a18e-1ceb4911605b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:f6:34', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:28 np0005546909 NetworkManager[55691]: <info>  [1764936508.9399] manager: (tapaf04237a-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.953 187212 INFO os_vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f')#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.virt.libvirt.vif [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.956 187212 DEBUG nova.network.os_vif_util [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.962 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.963 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.964 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Processing event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.965 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Received event network-vif-deleted-b66066cc-97eb-4896-a98d-267498dedf74 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.966 187212 DEBUG oslo_concurrency.lockutils [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.967 187212 DEBUG nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.967 187212 WARNING nova.compute.manager [req-d86246c5-40bf-4b3d-94a7-c4be68d65d3b req-78610b83-e8f5-4078-89f5-ca3f88fe47d1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.968 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.971 187212 DEBUG nova.virt.libvirt.guest [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec  5 07:08:28 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:54:f6:34"/>
Dec  5 07:08:28 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:08:28 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:28 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:08:28 np0005546909 nova_compute[187208]:  <target dev="tapaf04237a-1f"/>
Dec  5 07:08:28 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:08:28 np0005546909 nova_compute[187208]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.975 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936508.9725707, b235a96f-7a12-4bd2-8627-33b128346aa4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.975 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.979 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:08:28 np0005546909 kernel: tapaf04237a-1f: entered promiscuous mode
Dec  5 07:08:28 np0005546909 NetworkManager[55691]: <info>  [1764936508.9860] manager: (tapaf04237a-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Dec  5 07:08:28 np0005546909 nova_compute[187208]: 2025-12-05 12:08:28.986 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:28 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:28Z|00623|binding|INFO|Claiming lport af04237a-1f79-4f68-a18e-1ceb4911605b for this chassis.
Dec  5 07:08:28 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:28Z|00624|binding|INFO|af04237a-1f79-4f68-a18e-1ceb4911605b: Claiming fa:16:3e:54:f6:34 10.100.0.10
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:28.997 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:f6:34 10.100.0.10'], port_security=['fa:16:3e:54:f6:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=af04237a-1f79-4f68-a18e-1ceb4911605b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:28.998 104471 INFO neutron.agent.ovn.metadata.agent [-] Port af04237a-1f79-4f68-a18e-1ceb4911605b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.001 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.002 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:29Z|00625|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b ovn-installed in OVS
Dec  5 07:08:29 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:29Z|00626|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b up in Southbound
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.009 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.011 187212 INFO nova.virt.libvirt.driver [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance spawned successfully.#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.011 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.025 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.021 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cbaefc7a-aaa8-4cce-a688-f95fb3d3f28c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 systemd-udevd[229758]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:29 np0005546909 NetworkManager[55691]: <info>  [1764936509.0539] device (tapaf04237a-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:29 np0005546909 NetworkManager[55691]: <info>  [1764936509.0545] device (tapaf04237a-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.054 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.055 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.056 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.056 187212 DEBUG nova.virt.libvirt.driver [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.061 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[621d4b56-19b2-41a3-b17a-e1824a3d3efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.064 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b653d1b9-7752-44a1-be63-a2def887c341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.079 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.093 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2caf07-adcf-4db2-979c-d5fe2348b753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8517e25-df0c-40fb-bcc1-d78b1cbdae20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229764, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.123 187212 INFO nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 11.68 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.123 187212 DEBUG nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.125 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[130d708a-c998-4c7c-970b-701a127451ed]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229765, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229765, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.127 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.128 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.130 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.131 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:29 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:29.132 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.134 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.135 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.138 187212 DEBUG nova.virt.libvirt.driver [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:54:f6:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.190 187212 DEBUG nova.virt.libvirt.guest [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:08:29</nova:creationTime>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:08:29 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec  5 07:08:29 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec  5 07:08:29 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:29 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:08:29 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:08:29 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.246 187212 DEBUG oslo_concurrency.lockutils [None req-4034fea7-55b2-4646-ab77-bd0dbbe89a32 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.279 187212 INFO nova.compute.manager [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 12.39 seconds to build instance.#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.329 187212 DEBUG oslo_concurrency.lockutils [None req-59d9530f-4574-441b-8728-f427306b5573 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.508s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.573 187212 DEBUG nova.network.neutron [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:29 np0005546909 nova_compute[187208]: 2025-12-05 12:08:29.594 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:30 np0005546909 podman[229767]: 2025-12-05 12:08:30.208665851 +0000 UTC m=+0.054256725 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:08:30 np0005546909 podman[229766]: 2025-12-05 12:08:30.225727176 +0000 UTC m=+0.075344207 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 07:08:30 np0005546909 nova_compute[187208]: 2025-12-05 12:08:30.261 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:08:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:31Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:f6:34 10.100.0.10
Dec  5 07:08:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:31Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:f6:34 10.100.0.10
Dec  5 07:08:32 np0005546909 nova_compute[187208]: 2025-12-05 12:08:32.003 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936497.0025473, 5659bd52-8c24-483d-80a4-8eb6b28e1349 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:32 np0005546909 nova_compute[187208]: 2025-12-05 12:08:32.004 187212 INFO nova.compute.manager [-] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:32 np0005546909 nova_compute[187208]: 2025-12-05 12:08:32.401 187212 DEBUG nova.compute.manager [None req-3ff18a4c-6715-4e1d-89df-9cc7019e3e42 - - - - - -] [instance: 5659bd52-8c24-483d-80a4-8eb6b28e1349] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:32 np0005546909 nova_compute[187208]: 2025-12-05 12:08:32.925 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.270 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 WARNING nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state active and task_state None.#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.271 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG oslo_concurrency.lockutils [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 DEBUG nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.272 187212 WARNING nova.compute.manager [req-b1794228-de7c-4299-b37e-aedb5cf05eba req-3f12d053-4c48-432a-9be2-25745af975d0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state active and task_state None.#033[00m
Dec  5 07:08:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:33Z|00627|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec  5 07:08:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:33Z|00628|binding|INFO|Releasing lport d5a54702-8e08-4aa4-aef4-19a0cc66763a from this chassis (sb_readonly=0)
Dec  5 07:08:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:33Z|00629|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:08:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:33Z|00630|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.449 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port af04237a-1f79-4f68-a18e-1ceb4911605b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.450 187212 DEBUG nova.network.neutron [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.467 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.479 187212 DEBUG oslo_concurrency.lockutils [req-b1986ada-f7d1-4059-b672-f41b2bb5513f req-98a9deaa-4509-4ef9-9323-b8b7c3ad235a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:33 np0005546909 nova_compute[187208]: 2025-12-05 12:08:33.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:34 np0005546909 nova_compute[187208]: 2025-12-05 12:08:34.285 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.237 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.238 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.240 187212 INFO nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Terminating instance#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.241 187212 DEBUG nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:08:35 np0005546909 kernel: tap48b30c48-78 (unregistering): left promiscuous mode
Dec  5 07:08:35 np0005546909 NetworkManager[55691]: <info>  [1764936515.2827] device (tap48b30c48-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:08:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:35Z|00631|binding|INFO|Releasing lport 48b30c48-7858-408b-aeab-df46f6277546 from this chassis (sb_readonly=0)
Dec  5 07:08:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:35Z|00632|binding|INFO|Setting lport 48b30c48-7858-408b-aeab-df46f6277546 down in Southbound
Dec  5 07:08:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:35Z|00633|binding|INFO|Removing iface tap48b30c48-78 ovn-installed in OVS
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.297 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.308 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:bb:58 10.100.0.8'], port_security=['fa:16:3e:62:bb:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd355bd0-560e-4b18-a504-3a5134c930f4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '442a804e3368417d9de1636d533a25e0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '57e94004-ae40-473b-8b25-6fa2c9e8cf2d 994c2a79-1398-403d-88c3-e4993363396a fbf9a881-7958-4974-8ace-72447edf35a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67381b26-6b90-4d98-928b-9358d69f9e0c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=48b30c48-7858-408b-aeab-df46f6277546) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.309 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 48b30c48-7858-408b-aeab-df46f6277546 in datapath dd355bd0-560e-4b18-a504-3a5134c930f4 unbound from our chassis#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.311 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd355bd0-560e-4b18-a504-3a5134c930f4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.314 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac3f7af-e35f-4656-80e9-2723304e8825]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.317 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 namespace which is not needed anymore#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000003f.scope: Deactivated successfully.
Dec  5 07:08:35 np0005546909 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d0000003f.scope: Consumed 18.388s CPU time.
Dec  5 07:08:35 np0005546909 systemd-machined[153543]: Machine qemu-67-instance-0000003f terminated.
Dec  5 07:08:35 np0005546909 podman[229838]: 2025-12-05 12:08:35.368639224 +0000 UTC m=+0.065813061 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.381 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.382 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.383 187212 DEBUG nova.objects.instance [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:35 np0005546909 podman[229841]: 2025-12-05 12:08:35.405076201 +0000 UTC m=+0.088338164 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.451 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000047', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'bf30ed1956544c7eae67c989042126e4', 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'hostId': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : haproxy version is 2.8.14-c23fe91
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [NOTICE]   (226767) : path to executable is /usr/sbin/haproxy
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : Exiting Master process...
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : Exiting Master process...
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.458 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'name': 'tempest-ServerActionsTestOtherB-server-63085993', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000041', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [ALERT]    (226767) : Current worker (226769) exited with code 143 (Terminated)
Dec  5 07:08:35 np0005546909 neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4[226761]: [WARNING]  (226767) : All workers exited. Exiting... (0)
Dec  5 07:08:35 np0005546909 systemd[1]: libpod-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope: Deactivated successfully.
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 podman[229904]: 2025-12-05 12:08:35.467820692 +0000 UTC m=+0.056002546 container died 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.480 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3-userdata-shm.mount: Deactivated successfully.
Dec  5 07:08:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay-243fd1c5de96a14826aa2f40632d2c7e7d72bd7fcfdbb36dbcf9215a94d0a31f-merged.mount: Deactivated successfully.
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.510 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e9f9bf08-7688-4213-91ff-74f2271ec71d', 'name': 'tempest-SecurityGroupsTestJSON-server-1685847021', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000003f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '442a804e3368417d9de1636d533a25e0', 'user_id': '8db061f8c48141d1ac1c3216db1cc7f8', 'hostId': '14d16c9f1ccb5607b6b5f1aa93f1652eb8e59dac79369b13121c4e15', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.513 187212 INFO nova.virt.libvirt.driver [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Instance destroyed successfully.#033[00m
Dec  5 07:08:35 np0005546909 podman[229904]: 2025-12-05 12:08:35.513872858 +0000 UTC m=+0.102054712 container cleanup 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.513 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e846fccb774e44f585d8847897bc4229', 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'hostId': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.514 187212 DEBUG nova.objects.instance [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lazy-loading 'resources' on Instance uuid e9f9bf08-7688-4213-91ff-74f2271ec71d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.516 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000043', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.518 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '58cbd93e463049988ccd6d013893e7d6', 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'hostId': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.518 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec  5 07:08:35 np0005546909 systemd[1]: libpod-conmon-3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3.scope: Deactivated successfully.
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.532 187212 DEBUG nova.virt.libvirt.vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:06:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-1685847021',display_name='tempest-SecurityGroupsTestJSON-server-1685847021',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-1685847021',id=63,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:06:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='442a804e3368417d9de1636d533a25e0',ramdisk_id='',reservation_id='r-52243xe8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-549628149',owner_user_name='tempest-SecurityGroupsTestJSON-549628149-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:06:34Z,user_data=None,user_id='8db061f8c48141d1ac1c3216db1cc7f8',uuid=e9f9bf08-7688-4213-91ff-74f2271ec71d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.532 187212 DEBUG nova.network.os_vif_util [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converting VIF {"id": "48b30c48-7858-408b-aeab-df46f6277546", "address": "fa:16:3e:62:bb:58", "network": {"id": "dd355bd0-560e-4b18-a504-3a5134c930f4", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1395271785-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "442a804e3368417d9de1636d533a25e0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap48b30c48-78", "ovs_interfaceid": "48b30c48-7858-408b-aeab-df46f6277546", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.533 187212 DEBUG nova.network.os_vif_util [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.533 187212 DEBUG os_vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.535 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.536 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48b30c48-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.542 187212 INFO os_vif [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:bb:58,bridge_name='br-int',has_traffic_filtering=True,id=48b30c48-7858-408b-aeab-df46f6277546,network=Network(dd355bd0-560e-4b18-a504-3a5134c930f4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap48b30c48-78')#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.542 187212 INFO nova.virt.libvirt.driver [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deleting instance files /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d_del#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.543 187212 INFO nova.virt.libvirt.driver [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deletion of /var/lib/nova/instances/e9f9bf08-7688-4213-91ff-74f2271ec71d_del complete#033[00m
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.558 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.559 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 podman[229948]: 2025-12-05 12:08:35.586428824 +0000 UTC m=+0.047210651 container remove 3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.587 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.588 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.589 12 DEBUG ceilometer.compute.pollsters [-] Instance e9f9bf08-7688-4213-91ff-74f2271ec71d was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.596 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1806fd-f66f-41e4-96fc-305c12657505]: (4, ('Fri Dec  5 12:08:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 (3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3)\n3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3\nFri Dec  5 12:08:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 (3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3)\n3a2363c6da39f306f742b1ff17269d68fecb58412664181154fe0f0e319f56f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cd3ad137-9fc9-4892-a8b9-5dcbd06dda2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.602 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd355bd0-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:35 np0005546909 kernel: tapdd355bd0-50: left promiscuous mode
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.623 187212 INFO nova.compute.manager [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG oslo.service.loopingcall [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:08:35 np0005546909 nova_compute[187208]: 2025-12-05 12:08:35.624 187212 DEBUG nova.network.neutron [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.628 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.bytes volume: 72695808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.628 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[13177d45-fa8b-4396-960b-56d16eccfb44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.653 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04b58368-8ba2-4ece-8498-7cbe0d1a446e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb817bd-dd57-40ad-8881-b7c674b6a788]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.671 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9470fe06-033b-403d-9b20-2595ecbd9799]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376688, 'reachable_time': 30255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229963, 'error': None, 'target': 'ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 systemd[1]: run-netns-ovnmeta\x2ddd355bd0\x2d560e\x2d4b18\x2da504\x2d3a5134c930f4.mount: Deactivated successfully.
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.676 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd355bd0-560e-4b18-a504-3a5134c930f4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:08:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:35.676 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0d946dc6-48b5-4cb6-b42b-8212fa1f1d02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.683 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.bytes volume: 73019392 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.683 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.711 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 73105408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.712 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb8c4880-5e08-48cc-839e-eadb52b740a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '201d729a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '05ce5119b35f1c15bb70015d35f675c20943ac38e5cf80130865442d6302287e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '201d7f38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '3aa8da13f6537be734610b3d00dc2921f099831d6f3e9617ca06c0a3f381c053'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2021e53c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'cdb4184af8911c1a5542c1eec99f9823ffcda3da0be45878e44ae03ce76c5525'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2021efa0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '778c5a04b03139cd59cf0fe5f3d3dcc3ae64241e98ec4611cae116dfaabeedd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72695808, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2028091c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '8d8e161b8695456bfeacc512e972dcaade5f950cb44253ae1483b96b2f349f33'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'i
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20281498-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '7762c0161d44fe9d1305999163af85e9750b34b55f6fedfa77316f16557d8198'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73019392, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20306ec2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'bbeacdc1c1b0e83b93167431b711a16d24286dab58d99c959efe67e2669833f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20307b4c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '1c051f7bd09d67bec0a5abf502ad90ad0e675311b4a9b1980bab797ccc224975'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73105408, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2034c256-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '4e66922adfdad750824001caf1945d0587a03653a24e6dd7d7e009b201d979a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.518698', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2034cf8a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '2cebb1a0dd1d1841251a31b13c309ac0bb9971bd520c5a822a442eabb59c3199'}]}, 'timestamp': '2025-12-05 12:08:35.712361', '_unique_id': '77b5c73fee9f49bb9b3b3dd3377796f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.714 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.718 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b235a96f-7a12-4bd2-8627-33b128346aa4 / tapdf4eecd2-b2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.719 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.721 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 854e3893-3908-4b4a-b29c-7fb4384e4f0c / tap1b4ab157-dd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.721 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.722 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.728 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2e537618-f998-4c4d-8e1e-e9cc79219330 / tap11c7fa90-6a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.728 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.733 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapf7a6775e-6d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.734 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapd35fce09-85 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.734 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e72d05-87e7-495d-9dbb-1a10b112c69f / tapaf04237a-1f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.735 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.737 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets volume: 34 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d19fad0-6fd5-45bb-ac70-dff93151e07f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2035e71c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'fee47db44f162a3fd08d596d7a7da0a9bd15309cc9cc58761439137ea80f0418'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
14, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '20363852-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '7650bdac89fdfe95a9ca31feb8adf3d90c788f43b59e262eb3d3373f64389259'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '20375dcc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'ff58d63d451aa2a6aea7ecd244a2521212558ab1433273cec29e43b41b4e2ef8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '203854a2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'e7b7b9d594a828eb42e56c55333d138255e3df2bc1215e6c97d9c0e52951bba8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20385f38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '0e78845ca2b91f4041690b7f9bb60230c99b79c7cd9bb4a3d93056d25e43d2e1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 10, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9d
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '20386758-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c38f797055d222468e7d6a1bd0477b8fe880f865345f42f162f568e8caf73a99'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 34, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.714884', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '2038bd5c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '8352f53fc8d6862c0549cd85ee8fab362a3de0ec4a26b683636f094674a7c6c2'}]}, 'timestamp': '2025-12-05 12:08:35.738134', '_unique_id': '8492777c5f844d69be0bc228e331b9d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.739 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.740 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.741 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.741 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.742 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.743 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '684fa6c5-8297-4948-bcde-04cf6db486b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '20392b48-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'ad1af126ea2fc9873f86bfb456f70274b7e13298888d39c7f4e917e70cfe9f36'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '20393840-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'a1c3000ab6f020c599d36fef60df1ee4ac76d5e3a370c1a6ee8133a5663e01b9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '20395fc8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'eb91623c92f4fa75292aed4f68de5727e9d1f2d7b1a283a99b83b6b893e643f6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '20396bda-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '19e36f7c7da8813e5065095a9e2b1806c11e286b67045fa3e7e5f2492351b9ee'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20397742-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '2c78b419613ff1f67308bfb72e4b4ac5f634ff2c9f0567c2df73914c832e25be'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instanc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 35.740572', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2039828c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '15dce7b4e31b4b1d6f045064b5145f2095712f63e6ff46939cf934a684593742'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.740572', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 
1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20398eee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'e4006fb7bfa503ba471a6f283fdaab4d05eb2a395d4212cc5311b725ec284013'}]}, 'timestamp': '2025-12-05 12:08:35.743459', '_unique_id': 'f9c2a1c40a394b70bcbe506f45760e59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.744 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.763 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/cpu volume: 6460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.713 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:35 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.739 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:35 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.744 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.800 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.802 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.817 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/cpu volume: 11750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.834 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/cpu volume: 11840000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.849 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/cpu volume: 12410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef8736be-1abc-444c-bcfa-be4358f61eb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6460000000, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '203cab92-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.384245744, 'message_signature': 'e93ab0bd6cfadb4de6f6f58576f0d3b11233ae6c19c59fc87d32b5985197b768'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 
'854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '204251dc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.421296659, 'message_signature': '1cc5efe385103b4e43035e373c362774b0201907338d7cacb7164ef298890f97'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11750000000, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 
'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2044ffe0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.438771116, 'message_signature': '0ed32c2ee2dcfb7421c7ce31f2a6b9c213fad6ef526909cab2588c44d801e4e9'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11840000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '20478756-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.455383268, 'message_signature': 'b4c82fcb10738963592b05ad7ff1a3824922fcb36782a8b76da6cca44bc57ae2'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12410000000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 
'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:08:35.744867', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2049d646-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.47061859, 'message_signature': 'b3ff0d3e0e75285aa1014e19563464cb66b0a88674e5a38594fff850f39a9476'}]}, 'timestamp': '2025-12-05 12:08:35.850313', '_unique_id': '0cd0abfbdb7c41e4a9ff95d34c4f0c21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.851 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.852 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.853 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.854 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56f2ab59-ec64-445d-8489-d9d0fe0f68cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '204a3af0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'f9c283d1980d97f2ba9e9db6ada088314ced6e1c1e2ec2d49111a92242ec5d1c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '204a441e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '06399d1f669ba9bc57eccd4f5738767c8d68c86e520c88ed0e4e702d2b1085c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '204a666a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '8f22784105ec603a24910391a628597212ef4df43abb4c61504686a61f2945a6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '204a6ebc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '79248c156ec8964ae54c3c511d267e3d9ffeecf82e6282127b31a0874d32f1d8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '204a7ca4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'bf0a14baf20553b5ec460431995f1be90fd75428710e85fc641ce833c7ce3679'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 407', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '204a8780-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ca7ceb7123b4d4815b4cea0e4bcc8c8a0e488f46b1b3541763dc5f72a4d56e94'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.852407', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '204a91da-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '112eb29c08536e7732860c4f8cfc807c64e9529c9ea332d92925ac5986a25d85'}]}, 'timestamp': '2025-12-05 12:08:35.854927', '_unique_id': '3723bffae24345ed991a5fbd1b99bcca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.latency volume: 171140698 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.856 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.latency volume: 581557 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.857 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.latency volume: 175918589 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.857 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.latency volume: 18556500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.latency volume: 220016827 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.858 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.latency volume: 27449958 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.859 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.latency volume: 437318709 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.859 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.latency volume: 27936399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.860 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 227447368 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.860 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.latency volume: 33644734 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0da2d582-4f3e-40c6-8f84-1ea8704be4b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 171140698, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204ae2b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '48d0bd905c78517fda7a8d275cab62bc19e210c806f91c74a172fb9880fe1b8d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 581557, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204aee5a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '66eb2e2d42804a7920d2a8bc223343aa1620f89abe6c91c14a0c81d00a6dbdfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 175918589, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204af850-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '9c7a42f37aed46bc0673069f64d1a5fbbd9564a1298c524781036fda334cb6da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18556500, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b01e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '91933af07e9b3aaf4cbef7f065057a3c1d8775822ae61dd185921c550040ccdc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 220016827, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b2a6e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'd991d3cc90d90a55a7847003cd93bc73bba4804b182bedfa123adc3c4437bfd6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27449958, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': '
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: ': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b3392-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '9dfe42c9ad59b8b605138fd3e6f6e813eca9be246731e8b40bd50080695f8915'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 437318709, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b3d38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'dd69db352f1997eb9eb68126ee2cad6e698203eb621e93fb11ef72b592ec6209'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27936399, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b633a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '0150a2b2d56408697c93ad7f9f1f2285b0ca5fe89b20bec99315dfce5eec6562'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 227447368, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204b7460-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '2b1d378e9b884b882bf5c781e5c5ea7ae07bc41c5cc3627eb5dfd7283339da06'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33644734, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.856709', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204b7c76-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'efcbd0729d7831da5bfe630800050c1fa5d463864ae71efd9069baf8d6e0f284'}]}, 'timestamp': '2025-12-05 12:08:35.860906', '_unique_id': '7146063ded1d47dd9bb4f70cf3235dd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.862 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.875 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.875 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.888 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.888 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.889 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.898 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.899 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.910 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.911 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.922 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.922 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79d023e4-f7dd-4af2-946d-d2f9a2b938a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204dc49a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '92652fdd6e1cc11a801964655f929d22359fdf6c3eae6714c6f637b813829ff2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 
'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204dd2a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'd1ee9675c2f2047665e7ca41987050a5c649745ddd8a896b1b81928da07e766b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '204fbc28-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '1adac6ec33343fb0b8982e381b7353606d8f53004e3b32c7939e5c21dfd85dcc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '204fc830-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'b62fea3cc7bd5c0b25669c8c564c5ab964178925558ca02234d1799684c41ba3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 
'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20515bfa-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'cbbe61c7b289ae1487bdd5a5fc7e73dacc34e9aa138aac6a17a7d72e778fdd7a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64',
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 'disk_name': 'sda'}, 'message_id': '205167d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'c6f0e2bc36a206294f2ebd9c161fca9053b7abd685319ebb7a7fea185950dc6b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20533178-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '603113f12d9024a24a7a58811110ed0004407db8e02c9e9b49a6ccbd2d11d86e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.862932', 
'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2053412c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '58ec954ce9ac1cebba4f3d1d8b3847be9c91168e6625731636cab3fdf6179340'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2054ef04-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'ee38ed3c6485eb9ce7538ff885ffdc5e61e5d43b1ac49ec78ce814909675c436'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.862932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2054fefe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '9ecdd67f6991a70735ea5c187e8e66e83fdf2c9917d1ad18c2c7324dc65a060a'}]}, 'timestamp': '2025-12-05 12:08:35.923281', '_unique_id': 'b1c184d9c202449c95fc8eaec021659b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.925 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.926 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.926 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.927 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.928 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.929 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.929 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04286a0d-52dc-4cc0-8c14-5d96855cf353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20556736-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'f59782f43fb77a4ef85e71fff0bf05ece2bbdf90f90960d7a26d742f67ed1790'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205572bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': 'ec99e3110d61c4cd918aed58bade52dabdaa87d57696ca054f38e898ccaab1a3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20557e10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '9acbd53a2456f715d9dce89498efbe3c355929f72dabe73fe0450071a38da7a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055884c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': '72285ce06828400754c2b30c5b1b358187046ede9a857e5cb22847ad55218f0c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055c9e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '74540af1f3ca813e05c212029cae5e0aed51c2ff3c4bdcdedf2b253b126dfb3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055d400-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '07a44bfa568a74087e52955ea718199fb42d6402f22fd2eeb1d880054616f34f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055dd7e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': 'fbe6c3f59a441a4fe15b33b75b1e0a134d0c8dbc1d114275dbee6c58d8438e9b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': 
'2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055e5f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '7284fcb8448c66e7efd087dda5df246788e600dbf54e3c1ebd370b53aeacb7f5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2055f020-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'b9ce7650f177220b91b2b725b814307ad47ea147e4f08c374a5525c476c9aa67'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.925643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2055f926-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '90470fe4264809190868691f32b92dd46a0c702791d72b9dc5600f08f70c0189'}]}, 'timestamp': '2025-12-05 12:08:35.929636', '_unique_id': 'd6f92181b81d4177a2fb968c33478cfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.931 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.932 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.latency volume: 2084425918 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.933 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.latency volume: 4916924415 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.934 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.latency volume: 3430287673 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 5304399030 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.935 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58abde05-b4cb-4413-a588-a3e332eed4fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20566cee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '1369d420eb253d8392cf4c73bb1f9697cb1131b3994c0fea05c81f5fb4615bab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056791e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'f9050353c1c5447f38d42b8b9b266ab0ecebe0506ca1218d3442c79679c6754c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2084425918, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20568468-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'dd24d851a33264abf124d0da7eed6fb19b80236b7646d2b01378c39f23f5dae8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20568eae-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '3244ca63c622fb4efd93365c7cd3b914d62660748a3e244daaeb7ce875c4571f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4916924415, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 
'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056c752-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '42b2adafda3796ebf21b481569438a79366f2ac56ddc46048ff1b0c55904f30a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056d2c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'f0798bfeb17a6b605fed10cbd8e1a5b1ffe7fb9795cc2e6085b332949d9f3e2a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3430287673, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056dd1e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'eea674d79e02c3c92517a94538cdb469911e9b4cd01abdee14e2a6c90aa10366'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2056e7a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '24b86b97a654f4245bac81bae5715f4df9704a84feb534e04a2fc1b7dee33dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5304399030, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2056f0e2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '0a45b017384bf5d5027f773acfc56649187e128b86acbb16087bb45b4cd628e1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.932317', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205720bc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '1030c451b7804b01ac3c1efb249357bbd23b339c4b675ac3b143e7a90accede0'}]}, 'timestamp': '2025-12-05 12:08:35.937433', '_unique_id': 'de7bef9a548c43699ded393a4874595d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.940 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.942 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.942 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.943 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes.delta volume: 336 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '267e36dd-81fc-4788-8704-a6e8bc056d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2057b0f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '971b4475ebffc5e1c472af3e9cd8d33bbea60e3c3eb93eba8bf425ee937b4335'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '2057bd38-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'a664120d8af0734f28f344f2509adae6543b108c94837f7d6c5b9024177f07bc'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205805ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '2142aa7d0503c58b5a91aee9ff022885d13ded73871b39988f363f2830a4e38d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205815d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '3c2b1f839ea7dfaa45325c3ffd0a1eb4d1ea049d2ed3bc71c354b27b932ad69d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '20581f12-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ab15cfafa366d8099a2e7232d9cb674db5fbbb28331e08632301cd8013ed785d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timest
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: cesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2058287c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '77bb4fdd8860f27a1abf3b261c8fec205e540a9cd41e4ad1ffb2c9717786bc17'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 336, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.940500', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20583326-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '22535c6c6f5cd684cdd8554d4cd60cbfa85b47c2b0d357031474c61f2fd96650'}]}, 'timestamp': '2025-12-05 12:08:35.944281', '_unique_id': 'a2af0191fe2e4dd49a37d478def0556b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.946 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.946 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.947 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.948 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.948 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.949 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0e6b3ef-2e84-484f-a91e-21464dd84cdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058a6f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'd2d70ad0041fe27eebcf42a64b0ee1b16acc5436c55182091f81e8ee4c61cb26'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058b2c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '9fb1058908573a1a77250c2850d00cd02779bc7f0ef39608db9cf3315ed25920'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058bbd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '509486d564bf737ad45e63554ed5cd16d49d14a0620a494a77a6d01b633ef78b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058c502-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '9ae2d3ec74e2fb7ffacf13b11d9906c1b19e957081782903346b831054953f77'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2058f0a4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '1701c5906a0c49da91a2569597c4bcc4c73b66fb4bed09bb4aa8dfd9ce6e1ac4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: ype': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2058faea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '7bf52aecf7d0f2c1fc1bae683b2a311e21ff294a4c5d2a448ae4a17fae09495e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20590288-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'eb1ac22815b449aa8f18d861bcfd21b2c7c9bf7ccf0c9fa8c9420516caaf0068'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205909e0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '21c2c74e246b7228e25947699e01373a1fc840524508492d0153e63f78ab6792'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059135e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '070e19d9813ea5234bd6d760f6810df19a27a3fbee764a0e2827a6e7d946fab9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.946873', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '20591c0a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '93e717662e00d3b9adf0db352a766db14ad643603230426d90f2be99501cb8dd'}]}, 'timestamp': '2025-12-05 12:08:35.950186', '_unique_id': 'b9448f6c49ce4f87a2d0883a2235e0fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.952 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.953 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.bytes volume: 29936128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.953 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.954 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.954 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.bytes volume: 30329344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.bytes volume: 30759424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 30104064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.955 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2dafa48-713c-47d7-bee7-e96133a0d018', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059851e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '35de9dc9562f0e7382b0b6d71a1793af53ee8f848db09627ff817388d93f4a40'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': 
None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205991f8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '0e954dfc42a65855e0bff8f460764b91dc1c55422c0d737b3935626725824a62'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29936128, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20599cd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '332f3d3a1ae17de52ab362b901fb9d3fb9229e6fa318793742cc894c802a2666'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059a7ba-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'c974c083f834b369d57909248254314af6eb2c3fd844c60ec5a9e16cb5dc3db4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30329344, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 
'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059da3c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': 'd85a3dcb9be9a48daa1165c58d878068dd50751fa4962d1d5fb5d62b100db564'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b3
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: ry_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059e5cc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '068edc69447c3c99b61675e7330bd60d5c0be5994900c1b32c797ed36e7ce84f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30759424, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059ed9c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '2b2a9aef1dbeda1390f8aabb59aae4f63593e95b79c35a765b915f65bf60c677'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2059f4ea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'a34d97ca2c13ea596343fc6be1f7dc9ab9b49fffc30677eabfc0060b6249b06a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30104064, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2059fc56-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'c422954a43d226c8c1c1a2e78a055c2f69371900bb07384129f873adb656011c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.952576', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205a039a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'bf0dcc722b4da9ad6a0ad8446911b01987b5001eeff1b823b88a7cf29918f20e'}]}, 'timestamp': '2025-12-05 12:08:35.956138', '_unique_id': '7b201022e817404b9a3994a28a92505e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.958 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.959 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.960 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.961 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0790796b-5252-4eab-91cb-fd0b4ad48ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205a6704-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'd2cf64aeb8c5c7633ba9547e6340bbaf25def0e409c069314f6479d736e3b64c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205a7320-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '192258da666b7b1d29133d7a0f6aaa6daab73a1a9c0f45a8a4b7cd08a4580d9d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205aa660-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'a95cfb0768f494fb3aed5dbe0272bd47fd71a7b3525d55eaf84eb8ab8a48a0d8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205ab0ba-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'aabd79c1a173e35cc32a2360cddfcf6fd5365d972296ba4230eb7e2a93dfe5c2'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205aba10-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c1c4e6e9dd37d69e84480309cba23f12f9150ff7bb8d667ef9def6abce837523'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instanc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 35.958330', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205ac370-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '1724bb5978c7a383b5cdb3e61e73cce15369eb354c22ee372e164e57c6010295'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.958330', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 
1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205acde8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'ef4e98ff93bfbd694cde8ab8279fe9002343e628f01e1afb5c7e0b0a012c44e4'}]}, 'timestamp': '2025-12-05 12:08:35.961331', '_unique_id': '19f8ef7eb173439f9710c538e4c426a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.963 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.964 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.965 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.bytes volume: 1200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.966 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.967 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b0ec32b-d5d5-4a0d-9208-e7412f7a909e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205b4ade-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '574ada1abe9f51229118763463cbaaaf4076d155adeb445d5f30ea0a8dc1f0fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205b5632-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '612dd2e38eb6ffd1ecb689269ffd61510c26f2283529256d3237d23945b6dbb1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1200, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205b9368-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'df09b6e9d306f0124b4101ede5785fc5282812225331af07dc1fe8bceada39ba'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205b9ffc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '1589adabc2ef5bb5ec90ec3a47dd986c42f7ca71c7616a077fba504ca75f658b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205baa7e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '7813e7de478ff46b4c5cbdc3b40fb1b731e8e1e7ab0066ef7d7f716f36f60cf0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: chInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205bb69a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '042b05d07848fbf1de813ac97a4b87baf31160b3df51ba801bc5a95d32b4efec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.964224', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205bc112-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'd4b8aa1c4f3150ca766a6395e61630273acc548732263351e88c7416669a96f9'}]}, 'timestamp': '2025-12-05 12:08:35.967568', '_unique_id': '759b6577b8bd4a0ea287dd271c2599c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.969 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.970 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.970 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.971 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.972 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adf4f7b3-9be8-406e-a73f-22afad0cf83c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205c320a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'df92b9c81b2d9badb310091f08d890b7b0e749bec11069092b5c1348cc6bc0f7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205c3e62-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': 'f49a18cc7586efe503938f351b10395c8138f918b57cf87fdc6759b8b05597a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205c670c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '9e252fab699c90c422b937a6c9fcd02739889e0446c985824a810e09c3b183c4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205c71de-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '8347b558746a4dd282e0fbaff7313bcb9563e7f364d1d7ec133899c8aabbfa5c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '205c7d6e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '22a033789af5b904890514ed34a073963ef9e39d600f113818048063929931c6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 136', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '205c87d2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ccf2975fc094577103fc8cc89e1cd2b1d58778e61ba6ca471598a406151f6135'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.970136', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '205c9254-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '6d40919750b9d2cf2d708d21862354ca76ffcbd7e31ed378cac8ad9656104c46'}]}, 'timestamp': '2025-12-05 12:08:35.972920', '_unique_id': '80c8856510b24661bcd86802c0b0fc15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.975 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.requests volume: 1074 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.976 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.requests volume: 1094 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.977 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.requests volume: 1111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.978 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a8f07f6-13f7-4036-9643-5a9ba7ced4d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205cfc08-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': 'b669f3d2848f8e8d3719fae78ff0d9a2865ab0667241a0c90304f18a56bd6cfe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 
'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d061c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.139732379, 'message_signature': '99911f554e89d012dd1a8307fab2034bb13588e684a253b33cfed396d753761c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1074, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d0f40-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': 'c864d285e42cc563b5600b165684654a82b1ea53e4b85ba0c0334dc1783cf6e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d19fe-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.180531333, 'message_signature': '997d7f7322e4677b26988a9f7cb117cdb9358c574d6f26ec9c8bd6c627d51ead'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1094, 'user_id': 
'6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d4adc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '3e14b7c441e3c121da6309a36c5aecf1536e020168da300cbca767c792cd10c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_re
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: _type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d54a0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.210573964, 'message_signature': '83d166e19e2ccfc7b83ff5f16a6068f082cb7b27660e3326c28d69f7b8604c85'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1111, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d5e28-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': 'd7f3c55e30f867a065831382445a2037be5c916623922971ddb35277aeb678b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d6df0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.249940337, 'message_signature': '699e0cf9cec708635e0769c755e5f7a4b6e71ccc7ae288213b0587b0fb0e7699'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205d793a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': 'ed29f945406ed6e3f24a0eb3b7ec3c2ed4f8d4023acb030d3ab8fa689481e82e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.975290', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205d829a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.304953043, 'message_signature': '6c5bf30981946e5806d2396508ec938fe52b9a20af598012f2061ecea0581b90'}]}, 'timestamp': '2025-12-05 12:08:35.979080', '_unique_id': '2143db660d7b40a0b17a869adf270ee9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-959694714>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-63085993>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-1436335913>, <NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1099990882>]
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.982 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.983 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.984 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.985 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba73126b-a954-411a-82e8-60b1d010d68e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e0e54-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '5011df8c6adfd1c420bc0d0969327ff0715110167741fd6d79b06dab2d1c8aee'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 
'resource_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'instance-00000047', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e1ab6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.484021949, 'message_signature': '741270a0c279cc60dbf919aea9cb0ffaed05d7334c91082199c3fc5a4f56ed87'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e2538-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'f08dfae0e5e54b88bf8e9fd9cccaf748bd12620ea14a021c30fe8fd53fef6f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e33b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.497278904, 'message_signature': 'd992f150d429946fef18d21ad61eb98c68de807f399a4a0007e3099470e04243'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 
'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e68d6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': '87a74484e968a10538e982b8bc675374fb808b61e1b9d63aada9f0d9cb81920e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': '2e537618-f998-4c4d-8e1e-e9cc79219330-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': N
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: hemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e73d0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.510750485, 'message_signature': 'c72c8ba2939b708a38e95ef53b1c7ac8f6e121fb4533611cb7ae692a3a60bcac'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e7d44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '490166e199e878f4f2104aa48cf3ae43792a41669495580db642a023d459be02'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f-sda', 'timestamp': 
'2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e8654-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.520750035, 'message_signature': '2bfcd6db82821a3d1ccc855b34979fcd759c2fc04df6f7fd5e97c8e0dc417075'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-vda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '205e9004-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': '5c2b1c974cdef080e5a4de4c4be5460b0619cf1e631d14e8c83940c2056bac56'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54-sda', 'timestamp': '2025-12-05T12:08:35.982287', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '205e9b1c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.532889788, 'message_signature': 'f66664f5c35e34c257144634e918fcf7706cb3dd6de8075d738b2543fc3c562c'}]}, 'timestamp': '2025-12-05 12:08:35.986251', '_unique_id': '22f74bf530c34cd09c45f8659f2149cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.988 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b235a96f-7a12-4bd2-8627-33b128346aa4: ceilometer.compute.pollsters.NoVolumeException
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.989 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/memory.usage volume: 42.27734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/memory.usage volume: 44.1328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.990 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/memory.usage volume: 42.59765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa74d28-4b72-4c89-940d-56fd02f2da77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.27734375, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'instance-00000041', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f1826-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.421296659, 'message_signature': '216d89c1585e5c1aff73768c8f6fbf64fed1ccca34fa6066d8c317c478a1b876'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 
'2e537618-f998-4c4d-8e1e-e9cc79219330', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'instance-00000046', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': '655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f4bc0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.438771116, 'message_signature': '60ab1233c983f133ba2f4f14c812034982a66cee15c222225f2127743a804c6e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.1328125, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'instance-00000043', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 
'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f55ac-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.455383268, 'message_signature': '2688011807e02f9dec6d304e80e2446fbd09de606d1149c38f2ff1c1f884641b'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59765625, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'timestamp': '2025-12-05T12:08:35.988879', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'instance-00000036', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '205f5fd4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.47061859, 'message_signature': '4f17be869f03b2133ffd16c549f569ba0e8f379cfef090b5e31779ad031e179c'}]}, 'timestamp': '2025-12-05 12:08:35.991266', '_unique_id': '0e7943bad0a14df19eee7701771f7fdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.992 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.993 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.994 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.994 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.995 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c0c452-d841-412e-9c0e-217b9902f0d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '205fbb00-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'd91ffbc6ddb3c08d5d4154548cdc9d13a96d735ae84b0bdd059a58ff1e50e339'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '205fc686-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '90d30b7a924313553fc2c252b4cea96df66130235cb08e501c364a88b9a1c63c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '205ff16a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': 'eeb56b26c46c0332f81297033af7057213c1808966e8395feca3790b1968d2a1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '205ffe80-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'ba0c1cee425abbf57f08fd31b53e55955a5c194dc384d1d3400ceb3c56a95061'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2060090c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'fad65b0f0d71bcdb342bd7f019a0f4d99988d98a4f17f4e96aef39f30551c215'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f', 'timest
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: cesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '20601352-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '91da91625f745b556188f54e38fffa9182760ef139ae308e326416dbafbddf0f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.993295', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20601db6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '751ab3db8877f86fa692b43e0d0fd5e92be0240b48fe967eed72f79edc009ec5'}]}, 'timestamp': '2025-12-05 12:08:35.996168', '_unique_id': 'cddf595005d34c5694baeb1462c6e04d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.998 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:35.999 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.000 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.000 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c26dd6a-b2c9-42ad-9285-6579de3bd5b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '20607be4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': '7a06ae9dfc0dcb1af5f6e801cc7e26a14274f50ee8abe1cc864951e400c68692'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
16, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '206086a2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '3f32cdd01a0e8c3bad1662f1f92153785286100b0464107e4ca67d8c7f2cecad'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '2060aa56-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '2075f4b56a3dbfda0d71173ff0357ac91b1d5fb824eb78f79727c9fba508a7d4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '2060b4c4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '24e0fe6c185ac8eb0181284d1e2131e6f35cb8320407f9109c22a919490a3854'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2060bfdc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'a416d1207db9b4c0a3c6afef0370c034930f7001c42f47d0c21664618a250a3c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 11, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9d
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2060c9be-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '43665c3fe8afb1c17b5e3756bacae5779c5458614b575fde2c961b138abe4608'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:35.998269', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '2060d418-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': '05848860ebcb94e989e57091f2b188b0c0ebcc639e24a939e8d0c6223e56f565'}]}, 'timestamp': '2025-12-05 12:08:36.000802', '_unique_id': 'a2a21b98096a4f5d874b033382fe0b33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 DEBUG ceilometer.compute.pollsters [-] b235a96f-7a12-4bd2-8627-33b128346aa4/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.002 12 DEBUG ceilometer.compute.pollsters [-] 854e3893-3908-4b4a-b29c-7fb4384e4f0c/network.incoming.bytes volume: 1514 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d'
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.003 12 DEBUG ceilometer.compute.pollsters [-] Exception while getting samples Error from libvirt while looking up instance <name=instance-0000003f, id=e9f9bf08-7688-4213-91ff-74f2271ec71d>: [Error Code 42] Domain not found: no domain with matching uuid 'e9f9bf08-7688-4213-91ff-74f2271ec71d' get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:150
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.003 12 DEBUG ceilometer.compute.pollsters [-] 2e537618-f998-4c4d-8e1e-e9cc79219330/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 4447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] f1e72d05-87e7-495d-9dbb-1a10b112c69f/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.004 12 DEBUG ceilometer.compute.pollsters [-] 24358eea-14fb-4863-a6c4-aadcdb495f54/network.incoming.bytes volume: 4531 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4433084-a900-4741-b5d7-47eea4b354a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '132d581de02e49b9a4c99b9b831dd5b5', 'user_name': None, 'project_id': 'bf30ed1956544c7eae67c989042126e4', 'project_name': None, 'resource_id': 'instance-00000047-b235a96f-7a12-4bd2-8627-33b128346aa4-tapdf4eecd2-b2', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-959694714', 'name': 'tapdf4eecd2-b2', 'instance_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'instance_type': 'm1.nano', 'host': '769c9e1b6c1aee885daf651de25d0d17da0de6132c3c15165f51afac', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:40:3b:49', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapdf4eecd2-b2'}, 'message_id': '2061274c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.335945892, 'message_signature': 'fe0b1aa6fa68533275b4e3174802b4f61a923cc0eec1cd76da31f850de5d8fa4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1514, 
'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000041-854e3893-3908-4b4a-b29c-7fb4384e4f0c-tap1b4ab157-dd', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-63085993', 'name': 'tap1b4ab157-dd', 'instance_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:e5:0a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b4ab157-dd'}, 'message_id': '2061328c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.340513655, 'message_signature': '4481ee7984dc8efdb1a24f5ca762f7bed7f1ffe916abf47d8576ca6e176bfbf5'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '6a2cefdbcaae4db3b3ece95c8227d77e', 'user_name': None, 'project_id': 'e846fccb774e44f585d8847897bc4229', 'project_name': None, 'resource_id': 'instance-00000046-2e537618-f998-4c4d-8e1e-e9cc79219330-tap11c7fa90-6a', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerRescueTestJSONUnderV235-server-1436335913', 'name': 'tap11c7fa90-6a', 'instance_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'instance_type': 'm1.nano', 'host': 
'655a41ad63529d03a1924532c4771f7046a6652dd038f74dc2152cf9', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e4:ee:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap11c7fa90-6a'}, 'message_id': '2061519a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.343368888, 'message_signature': '4c48751a996f5842690ad3048be2af1627a010ccf21402d481014076e841e75a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4447, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapf7a6775e-6d', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapf7a6775e-6d', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:01:99:b0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7a6775e-6d'}, 'message_id': '20615d20-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'dd7865b7c3700ed2fd68616fb60574b10f4ec8982735251dbbfc86a8c40a4f60'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapd35fce09-85', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1099990882', 'name': 'tapd35fce09-85', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b8:01:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd35fce09-85'}, 'message_id': '2061672a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': '3145e2675519b223dc27630afb0e272709b4de1cb5208ee5eb9b987bb96cf0d1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-00000043-f1e72d05-87e7-495d-9dbb-1a10b112c69f-tapaf04237a-1f'
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: chInterfacesTestJSON-server-1099990882', 'name': 'tapaf04237a-1f', 'instance_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:54:f6:34', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaf04237a-1f'}, 'message_id': '2061713e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.350172845, 'message_signature': 'c99a09c46f3943103555977f1cc9d529fed04688226d2c43275f4ecbf914ceee'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4531, 'user_id': '4ad1281afc874c0ca55d908d3a6e05a8', 'user_name': None, 'project_id': '58cbd93e463049988ccd6d013893e7d6', 'project_name': None, 'resource_id': 'instance-00000036-24358eea-14fb-4863-a6c4-aadcdb495f54-tap2e9efd6c-74', 'timestamp': '2025-12-05T12:08:36.002643', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1629320086', 'name': 'tap2e9efd6c-74', 'instance_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'instance_type': 'm1.nano', 'host': '2e2354155eefbbd1aeb1be13d979954e1a61cae06325031c060c2815', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ab:5e:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2e9efd6c-74'}, 'message_id': '20617c74-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 3896.35686831, 'message_signature': 'c43ffa0cbaa4887153978df74d5fdcddb7dc60e4c57dee55bce7366ab0a6c4be'}]}, 'timestamp': '2025-12-05 12:08:36.005135', '_unique_id': '1e724de2ee974b3ca320dcb37a537b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:08:36 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:08:36 np0005546909 nova_compute[187208]: 2025-12-05 12:08:36.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:36 np0005546909 nova_compute[187208]: 2025-12-05 12:08:36.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.855 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.862 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.924 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.930 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.939 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.945 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.951 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.957 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.962 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.968 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.974 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.980 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.987 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:35.997 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:36.001 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 12:08:36.006 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec  5 07:08:36 np0005546909 nova_compute[187208]: 2025-12-05 12:08:36.338 187212 DEBUG nova.objects.instance [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_requests' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:36 np0005546909 nova_compute[187208]: 2025-12-05 12:08:36.356 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.015 187212 DEBUG nova.policy [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.082 187212 DEBUG nova.network.neutron [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.097 187212 INFO nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Took 1.47 seconds to deallocate network for instance.#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.144 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG oslo_concurrency.lockutils [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.145 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.146 187212 DEBUG nova.compute.manager [req-e81468d9-2d71-43e8-8e4f-c00467961bbb req-604a7dff-f3ca-409e-8c23-66c31ccde568 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-unplugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.147 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.147 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.413 187212 DEBUG nova.compute.provider_tree [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.431 187212 DEBUG nova.scheduler.client.report [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.454 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.493 187212 INFO nova.scheduler.client.report [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Deleted allocations for instance e9f9bf08-7688-4213-91ff-74f2271ec71d#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.578 187212 DEBUG oslo_concurrency.lockutils [None req-c1d07dfb-c526-483c-abca-7cf4ad65490a 8db061f8c48141d1ac1c3216db1cc7f8 442a804e3368417d9de1636d533a25e0 - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.733 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Successfully updated port: 08b15784-5374-4fb3-9f63-82412f709db4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.768 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936502.7674022, 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.769 187212 INFO nova.compute.manager [-] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.880 187212 DEBUG nova.compute.manager [None req-4cb947f3-3cf3-4528-88b6-811cf3053d24 - - - - - -] [instance: 3b3ab1ca-8cbe-45d7-b28b-bacd1c0fdcbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.883 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:37 np0005546909 nova_compute[187208]: 2025-12-05 12:08:37.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.134 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.135 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.135 187212 INFO nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shelving#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.156 187212 DEBUG nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:38 np0005546909 nova_compute[187208]: 2025-12-05 12:08:38.247 187212 WARNING nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] fbfed6fc-3701-4311-a4c2-8c49c5b7584c already exists in list: networks containing: ['fbfed6fc-3701-4311-a4c2-8c49c5b7584c']. ignoring it#033[00m
Dec  5 07:08:39 np0005546909 podman[229969]: 2025-12-05 12:08:39.217052908 +0000 UTC m=+0.063834443 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.305 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:08:40 np0005546909 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec  5 07:08:40 np0005546909 NetworkManager[55691]: <info>  [1764936520.4103] device (tap2e9efd6c-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00634|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00635|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.420 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00636|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.428 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.429 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.431 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.433 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.449 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[834d9d90-b444-4de5-900e-e561485d8934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.479 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[53c7de55-198e-4eef-9a7a-bff5ab5e1112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.483 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[68eec4d1-8d1a-455d-b455-67716c03913d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000036.scope: Deactivated successfully.
Dec  5 07:08:40 np0005546909 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000036.scope: Consumed 20.652s CPU time.
Dec  5 07:08:40 np0005546909 systemd-machined[153543]: Machine qemu-58-instance-00000036 terminated.
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.512 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f10650-8575-47e9-8711-ec7af9ac216d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.531 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ef876a1f-349e-4ce2-ad7f-71212ee58877]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230002, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.538 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.548 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4791c766-b010-4514-8b24-dc5e9014286c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230003, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230003, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.550 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.557 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.558 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:40 np0005546909 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec  5 07:08:40 np0005546909 NetworkManager[55691]: <info>  [1764936520.6411] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Dec  5 07:08:40 np0005546909 systemd-udevd[229993]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:40 np0005546909 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00637|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00638|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.654 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.655 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.658 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00639|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00640|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00641|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=1)
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.666 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00642|if_status|INFO|Dropped 2 log messages in last 127 seconds (most recently, 127 seconds ago) due to excessive rate
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00643|if_status|INFO|Not setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down as sb is readonly
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00644|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00645|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.670 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:40Z|00646|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.672 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b04e03cb-6b39-4e1b-be8e-fe97b97bc772]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.681 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.701 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e52e01-c6e8-4b20-9b83-acc814b03003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.704 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[08618253-02b7-439f-a0b1-73ddc9714ba1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.734 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6d1807-61ab-4856-aa80-43d16fb5085d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.755 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c110524-8b21-464e-b0c4-a9de111a2f3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230028, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.776 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c995eccd-ec0a-4a0d-a60a-633f91d1adbe]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230029, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230029, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.776 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "e9f9bf08-7688-4213-91ff-74f2271ec71d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.777 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] No waiting events found dispatching network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 WARNING nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received unexpected event network-vif-plugged-48b30c48-7858-408b-aeab-df46f6277546 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Received event network-vif-deleted-48b30c48-7858-408b-aeab-df46f6277546 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-changed-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG nova.compute.manager [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing instance network info cache due to event network-changed-08b15784-5374-4fb3-9f63-82412f709db4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.778 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.778 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.785 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.786 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.786 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.787 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.790 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6450fd-1b47-41af-8ccf-2aebbdf5744b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.837 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b10c3be7-6027-4df4-bea0-d50b39f3a5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.840 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ec4cdf-b4f7-49a7-a3e9-928ea84649ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.874 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c6aeebcf-ab64-45a4-9671-62d260e19938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.892 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c57083f-917d-4b7a-a1f4-aa772b7ed735]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 15, 'rx_bytes': 700, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230036, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.911 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4825dded-fc87-4a5f-9724-47b4ab2bff9a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230037, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230037, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.914 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 nova_compute[187208]: 2025-12-05 12:08:40.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.920 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.921 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:40.921 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.074 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.099 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.100 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.172 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance shutdown successfully after 3 seconds.#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.180 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.180 187212 DEBUG nova.objects.instance [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.535 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Beginning cold snapshot process#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.695 187212 DEBUG nova.privsep.utils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:08:41 np0005546909 nova_compute[187208]: 2025-12-05 12:08:41.696 187212 DEBUG oslo_concurrency.processutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk /var/lib/nova/instances/snapshots/tmpg0u6kc0n/89f598cb738c44249d2624dce0df0c86 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.063 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.100 187212 DEBUG oslo_concurrency.processutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk /var/lib/nova/instances/snapshots/tmpg0u6kc0n/89f598cb738c44249d2624dce0df0c86" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.100 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot extracted, beginning image upload#033[00m
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:42 np0005546909 kernel: tap11c7fa90-6a (unregistering): left promiscuous mode
Dec  5 07:08:42 np0005546909 NetworkManager[55691]: <info>  [1764936522.7469] device (tap11c7fa90-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:08:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:42Z|00647|binding|INFO|Releasing lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 from this chassis (sb_readonly=0)
Dec  5 07:08:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:42Z|00648|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 down in Southbound
Dec  5 07:08:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:42Z|00649|binding|INFO|Removing iface tap11c7fa90-6a ovn-installed in OVS
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:42 np0005546909 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000046.scope: Deactivated successfully.
Dec  5 07:08:42 np0005546909 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000046.scope: Consumed 13.788s CPU time.
Dec  5 07:08:42 np0005546909 systemd-machined[153543]: Machine qemu-77-instance-00000046 terminated.
Dec  5 07:08:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.837 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.838 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 unbound from our chassis#033[00m
Dec  5 07:08:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.840 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec  5 07:08:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:42.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2293cec9-e60a-4a6b-9cf1-8f162a56ca62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:42Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:3b:49 10.100.0.11
Dec  5 07:08:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:42Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:3b:49 10.100.0.11
Dec  5 07:08:42 np0005546909 nova_compute[187208]: 2025-12-05 12:08:42.930 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.321 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.326 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.327 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.467 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Attempting rescue#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.469 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.473 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.473 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating image(s)#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.474 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.475 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.475 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.499 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.500 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.510 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.567 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.568 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.604 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.606 187212 DEBUG oslo_concurrency.lockutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.607 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.626 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.628 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start _get_guest_xml network_info=[{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': 'a6987852-063f-405d-a848-6b382694811e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.629 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'resources' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.665 187212 WARNING nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.678 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.679 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.684 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.685 187212 DEBUG nova.virt.libvirt.host [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.686 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.687 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.688 187212 DEBUG nova.virt.hardware [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.689 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.707 187212 DEBUG nova.virt.libvirt.vif [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:08:20Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.707 187212 DEBUG nova.network.os_vif_util [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "vif_mac": "fa:16:3e:e4:ee:e4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.708 187212 DEBUG nova.network.os_vif_util [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.709 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.740 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <uuid>2e537618-f998-4c4d-8e1e-e9cc79219330</uuid>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <name>instance-00000046</name>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerRescueTestJSONUnderV235-server-1436335913</nova:name>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:08:43</nova:creationTime>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:user uuid="6a2cefdbcaae4db3b3ece95c8227d77e">tempest-ServerRescueTestJSONUnderV235-1035500959-project-member</nova:user>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:project uuid="e846fccb774e44f585d8847897bc4229">tempest-ServerRescueTestJSONUnderV235-1035500959</nova:project>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        <nova:port uuid="11c7fa90-6a48-487a-a375-5adf7f41cb90">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="serial">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="uuid">2e537618-f998-4c4d-8e1e-e9cc79219330</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <target dev="vdb" bus="virtio"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:e4:ee:e4"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <target dev="tap11c7fa90-6a"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/console.log" append="off"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:08:43 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:08:43 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:08:43 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:08:43 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.748 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.827 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.828 187212 DEBUG nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] No VIF found with MAC fa:16:3e:e4:ee:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.829 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Using config drive#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.849 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:43 np0005546909 nova_compute[187208]: 2025-12-05 12:08:43.908 187212 DEBUG nova.objects.instance [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'keypairs' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.098 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.098 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.220 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.279 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.281 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.342 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.349 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.391 187212 INFO nova.virt.libvirt.driver [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Creating config drive at /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.398 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwzmm4kj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.421 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.422 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.505 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.510 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.530 187212 DEBUG oslo_concurrency.processutils [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwzmm4kj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.577 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.577 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 kernel: tap11c7fa90-6a: entered promiscuous mode
Dec  5 07:08:44 np0005546909 NetworkManager[55691]: <info>  [1764936524.5950] manager: (tap11c7fa90-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/257)
Dec  5 07:08:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:44Z|00650|binding|INFO|Claiming lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 for this chassis.
Dec  5 07:08:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:44Z|00651|binding|INFO|11c7fa90-6a48-487a-a375-5adf7f41cb90: Claiming fa:16:3e:e4:ee:e4 10.100.0.2
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:44Z|00652|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 ovn-installed in OVS
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:44 np0005546909 systemd-udevd[230119]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:44 np0005546909 systemd-machined[153543]: New machine qemu-79-instance-00000046.
Dec  5 07:08:44 np0005546909 NetworkManager[55691]: <info>  [1764936524.6326] device (tap11c7fa90-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:44 np0005546909 NetworkManager[55691]: <info>  [1764936524.6334] device (tap11c7fa90-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:44 np0005546909 systemd[1]: Started Virtual Machine qemu-79-instance-00000046.
Dec  5 07:08:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.639 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:44Z|00653|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 up in Southbound
Dec  5 07:08:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.640 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 bound to our chassis#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.640 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk.rescue --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.641 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.642 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec  5 07:08:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:44.643 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4e60e849-3ad1-4ab6-b53e-b2ae9c101880]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.662 187212 DEBUG nova.network.neutron [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.700 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.700 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.760 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.769 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.826 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.827 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.897 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.904 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.972 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:44 np0005546909 nova_compute[187208]: 2025-12-05 12:08:44.973 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.029 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.238 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.240 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5072MB free_disk=72.9872817993164GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.241 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.241 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.251 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2e537618-f998-4c4d-8e1e-e9cc79219330 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.253 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936525.250553, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.253 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.259 187212 DEBUG nova.compute.manager [None req-ae109859-9a4c-4876-9f37-c3b0bb75303f 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:45 np0005546909 nova_compute[187208]: 2025-12-05 12:08:45.539 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:46 np0005546909 podman[230159]: 2025-12-05 12:08:46.204443017 +0000 UTC m=+0.058894937 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.036 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.037 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.037 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Refreshing network info cache for port 08b15784-5374-4fb3-9f63-82412f709db4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.041 187212 DEBUG nova.virt.libvirt.vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.042 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.043 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.043 187212 DEBUG os_vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.044 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.049 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.050 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08b15784-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.051 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08b15784-53, col_values=(('external_ids', {'iface-id': '08b15784-5374-4fb3-9f63-82412f709db4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:21:db', 'vm-uuid': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 NetworkManager[55691]: <info>  [1764936527.0536] manager: (tap08b15784-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.063 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.066 187212 INFO os_vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53')#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.067 187212 DEBUG nova.virt.libvirt.vif [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.067 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.068 187212 DEBUG nova.network.os_vif_util [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.070 187212 DEBUG nova.virt.libvirt.guest [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] attach device xml: <interface type="ethernet">
Dec  5 07:08:47 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:d1:21:db"/>
Dec  5 07:08:47 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:08:47 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:08:47 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:08:47 np0005546909 nova_compute[187208]:  <target dev="tap08b15784-53"/>
Dec  5 07:08:47 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:08:47 np0005546909 nova_compute[187208]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Dec  5 07:08:47 np0005546909 kernel: tap08b15784-53: entered promiscuous mode
Dec  5 07:08:47 np0005546909 NetworkManager[55691]: <info>  [1764936527.0821] manager: (tap08b15784-53): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Dec  5 07:08:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:47Z|00654|binding|INFO|Claiming lport 08b15784-5374-4fb3-9f63-82412f709db4 for this chassis.
Dec  5 07:08:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:47Z|00655|binding|INFO|08b15784-5374-4fb3-9f63-82412f709db4: Claiming fa:16:3e:d1:21:db 10.100.0.14
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.089 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 NetworkManager[55691]: <info>  [1764936527.1026] device (tap08b15784-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:08:47 np0005546909 NetworkManager[55691]: <info>  [1764936527.1041] device (tap08b15784-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:08:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:47Z|00656|binding|INFO|Setting lport 08b15784-5374-4fb3-9f63-82412f709db4 ovn-installed in OVS
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:47Z|00657|binding|INFO|Setting lport 08b15784-5374-4fb3-9f63-82412f709db4 up in Southbound
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.737 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:21:db 10.100.0.14'], port_security=['fa:16:3e:d1:21:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=08b15784-5374-4fb3-9f63-82412f709db4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.738 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 08b15784-5374-4fb3-9f63-82412f709db4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.740 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.754 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3d438086-9f38-4fce-9c45-85146370195b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.783 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c21076cf-5751-4e8a-94c8-c85aa2cbc623]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.786 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[7b54973e-e10c-4438-a847-b44fc03a7e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.797 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936525.2527544, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.797 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Started (Lifecycle Event)#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.816 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2d44534f-bb66-49c0-b90f-5c0a85fdfd27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.829 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2e537618-f998-4c4d-8e1e-e9cc79219330 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b235a96f-7a12-4bd2-8627-33b128346aa4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.830 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.831 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.832 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2a4702-c894-4a37-8900-9f9448d5765a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 10, 'rx_bytes': 784, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230201, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.846 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c389e3a-e388-4609-ae74-269697792674]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230202, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230202, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.848 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.899 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.903 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:47 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:47.904 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:08:47 np0005546909 nova_compute[187208]: 2025-12-05 12:08:47.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.096 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:08:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:48Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:21:db 10.100.0.14
Dec  5 07:08:48 np0005546909 ovn_controller[95610]: 2025-12-05T12:08:48Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:21:db 10.100.0.14
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.790 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:48.792 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:08:48 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:48.793 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.793 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:01:99:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:b8:01:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.794 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:54:f6:34, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.795 187212 DEBUG nova.virt.libvirt.driver [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:d1:21:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.828 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.829 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.834 187212 DEBUG nova.virt.libvirt.guest [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec  5 07:08:48 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:08:48 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:08:48 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:08:48 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:08:48 np0005546909 nova_compute[187208]: 2025-12-05 12:08:48.838 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:49 np0005546909 nova_compute[187208]: 2025-12-05 12:08:49.221 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:08:49 np0005546909 nova_compute[187208]: 2025-12-05 12:08:49.222 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:49 np0005546909 nova_compute[187208]: 2025-12-05 12:08:49.232 187212 DEBUG oslo_concurrency.lockutils [None req-c66026dd-50ca-4228-8116-54358dddee38 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-08b15784-5374-4fb3-9f63-82412f709db4" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 13.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.513 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936515.5109434, e9f9bf08-7688-4213-91ff-74f2271ec71d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.514 187212 INFO nova.compute.manager [-] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.532 187212 DEBUG nova.compute.manager [None req-f26b3fca-2f1d-423e-8432-748c57f477f2 - - - - - -] [instance: e9f9bf08-7688-4213-91ff-74f2271ec71d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.874 187212 DEBUG nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.875 187212 DEBUG oslo_concurrency.lockutils [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.876 187212 DEBUG nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:50 np0005546909 nova_compute[187208]: 2025-12-05 12:08:50.876 187212 WARNING nova.compute.manager [req-26fbbd29-bc23-4691-abf4-0d308000dbe3 req-76ec6703-17a8-45aa-8ca6-dc405e65e70b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.282 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updated VIF entry in instance network info cache for port 08b15784-5374-4fb3-9f63-82412f709db4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.283 187212 DEBUG nova.network.neutron [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.300 187212 DEBUG oslo_concurrency.lockutils [req-7e281bc1-49c6-41f2-b0ad-0da3569777a1 req-50db0f5d-5c77-4e5e-a1ed-a842ee2dfd46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.832 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Snapshot image upload complete#033[00m
Dec  5 07:08:51 np0005546909 nova_compute[187208]: 2025-12-05 12:08:51.833 187212 DEBUG nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.098 187212 INFO nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Shelve offloading#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.106 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.107 187212 DEBUG nova.compute.manager [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.109 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.110 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.110 187212 DEBUG nova.network.neutron [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:08:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:08:52.794 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:08:52 np0005546909 nova_compute[187208]: 2025-12-05 12:08:52.936 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:53 np0005546909 podman[230207]: 2025-12-05 12:08:53.211249949 +0000 UTC m=+0.066897136 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.327 187212 DEBUG nova.network.neutron [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.348 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.980 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.981 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.982 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.983 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.984 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.985 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.986 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.987 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.988 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.989 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.990 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved and task_state shelving_offloading.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.991 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.992 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.993 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.994 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.995 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 DEBUG oslo_concurrency.lockutils [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 DEBUG nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:08:54 np0005546909 nova_compute[187208]: 2025-12-05 12:08:54.996 187212 WARNING nova.compute.manager [req-ebb15bd4-33e3-4f1e-a955-fb741b7bd498 req-2dd88df9-cc51-4133-bf18-3946bfae634a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.698 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936520.6973228, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.699 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.716 187212 DEBUG nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.719 187212 DEBUG nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.772 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.775 187212 INFO nova.compute.manager [None req-5c938c21-26de-4056-811d-29ebb3301ac8 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.838 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.838 187212 DEBUG nova.objects.instance [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.852 187212 DEBUG nova.virt.libvirt.vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.853 187212 DEBUG nova.network.os_vif_util [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.853 187212 DEBUG nova.network.os_vif_util [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.854 187212 DEBUG os_vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.856 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.857 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e9efd6c-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.862 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.863 187212 INFO os_vif [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.864 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting instance files /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.870 187212 INFO nova.virt.libvirt.driver [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deletion of /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del complete
Dec  5 07:08:55 np0005546909 nova_compute[187208]: 2025-12-05 12:08:55.964 187212 INFO nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 24358eea-14fb-4863-a6c4-aadcdb495f54
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.020 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.021 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.057 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.077 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.078 187212 DEBUG nova.compute.provider_tree [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.099 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.179 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.266 187212 DEBUG nova.compute.provider_tree [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.349 187212 DEBUG nova.scheduler.client.report [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.371 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:56 np0005546909 nova_compute[187208]: 2025-12-05 12:08:56.419 187212 DEBUG oslo_concurrency.lockutils [None req-401a86d2-9c74-451e-9e25-0a169c664209 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:08:57 np0005546909 nova_compute[187208]: 2025-12-05 12:08:57.937 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.416 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state rescued and task_state None.
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.417 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 for instance with vm_state active and task_state None.
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.418 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG oslo_concurrency.lockutils [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 DEBUG nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.419 187212 WARNING nova.compute.manager [req-4f57ec37-e67d-4c36-ae8a-4ea27e2ccbad req-9562b78b-cead-4f29-92e2-dafce7fdf2ba 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-08b15784-5374-4fb3-9f63-82412f709db4 for instance with vm_state active and task_state None.
Dec  5 07:09:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:00Z|00658|binding|INFO|Releasing lport 0dffa729-6b55-4e58-afef-f1cdc22c22fb from this chassis (sb_readonly=0)
Dec  5 07:09:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:00Z|00659|binding|INFO|Releasing lport bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5 from this chassis (sb_readonly=0)
Dec  5 07:09:00 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:00Z|00660|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:09:00 np0005546909 nova_compute[187208]: 2025-12-05 12:09:00.859 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:09:01 np0005546909 podman[230236]: 2025-12-05 12:09:01.203316189 +0000 UTC m=+0.052806373 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:09:01 np0005546909 podman[230235]: 2025-12-05 12:09:01.209698532 +0000 UTC m=+0.062405998 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.openshift.expose-services=, release=1755695350)
Dec  5 07:09:02 np0005546909 nova_compute[187208]: 2025-12-05 12:09:02.939 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:09:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.015 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:09:03 np0005546909 nova_compute[187208]: 2025-12-05 12:09:03.981 187212 DEBUG nova.compute.manager [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:09:03 np0005546909 nova_compute[187208]: 2025-12-05 12:09:03.982 187212 DEBUG nova.compute.manager [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec  5 07:09:03 np0005546909 nova_compute[187208]: 2025-12-05 12:09:03.982 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:09:03 np0005546909 nova_compute[187208]: 2025-12-05 12:09:03.983 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:09:03 np0005546909 nova_compute[187208]: 2025-12-05 12:09:03.983 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.100 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.101 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.121 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.187 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.188 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.199 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.200 187212 INFO nova.compute.claims [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.407 187212 DEBUG nova.compute.provider_tree [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.428 187212 DEBUG nova.scheduler.client.report [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.452 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.452 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.524 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.525 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.543 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.569 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.673 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.674 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.674 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating image(s)#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.675 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.675 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.676 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.689 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.747 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.748 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.749 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.761 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.816 187212 DEBUG nova.policy [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f805540d6084f53aa7bd5a66912be58', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.824 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.824 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.861 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.863 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.863 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.920 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.921 187212 DEBUG nova.virt.disk.api [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Checking if we can resize image /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.922 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.980 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.980 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.988 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.988 187212 DEBUG nova.virt.disk.api [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Cannot resize image /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.989 187212 DEBUG nova.objects.instance [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'migration_context' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:04 np0005546909 nova_compute[187208]: 2025-12-05 12:09:04.998 187212 DEBUG nova.objects.instance [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'flavor' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.000 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.001 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Ensure instance console log exists: /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.001 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.002 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.002 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.018 187212 DEBUG nova.virt.libvirt.vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.019 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.019 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.023 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.025 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.030 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Attempting to detach device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.030 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:b8:01:47"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <target dev="tapd35fce09-85"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.036 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.040 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface>not found in domain: <domain type='kvm' id='73'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <name>instance-00000043</name>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <memory unit='KiB'>131072</memory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <vcpu placement='static'>1</vcpu>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <resource>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <partition>/machine</partition>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </resource>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <sysinfo type='smbios'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='manufacturer'>RDO</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='product'>OpenStack Compute</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='serial'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='uuid'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='family'>Virtual Machine</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <boot dev='hd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <smbios mode='sysinfo'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <vmcoreinfo state='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <cpu mode='custom' match='exact' check='full'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <vendor>AMD</vendor>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='x2apic'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc-deadline'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='hypervisor'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc_adjust'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='spec-ctrl'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='stibp'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='cmp_legacy'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='overflow-recov'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='succor'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='ibrs'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='amd-ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='virt-ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='lbrv'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='tsc-scale'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='vmcb-clean'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='flushbyasid'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pause-filter'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pfthreshold'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='xsaves'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svm'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='topoext'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='npt'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='nrip-save'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <clock offset='utc'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='pit' tickpolicy='delay'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='hpet' present='no'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_poweroff>destroy</on_poweroff>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_reboot>restart</on_reboot>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_crash>destroy</on_crash>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <disk type='file' device='disk'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk' index='2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backingStore type='file' index='3'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <format type='raw'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <backingStore/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      </backingStore>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='vda' bus='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='virtio-disk0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <disk type='file' device='cdrom'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='qemu' type='raw' cache='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config' index='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backingStore/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='sda' bus='sata'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <readonly/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='sata0-0-0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='0' model='pcie-root'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pcie.0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='1' port='0x10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='2' port='0x11'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='3' port='0x12'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='4' port='0x13'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='5' port='0x14'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='6' port='0x15'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='7' port='0x16'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='8' port='0x17'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.8'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='9' port='0x18'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.9'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='10' port='0x19'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='11' port='0x1a'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.11'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='12' port='0x1b'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.12'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='13' port='0x1c'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.13'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='14' port='0x1d'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.14'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='15' port='0x1e'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.15'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='16' port='0x1f'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.16'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='17' port='0x20'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.17'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='18' port='0x21'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.18'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='19' port='0x22'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.19'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='20' port='0x23'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.20'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='21' port='0x24'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.21'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='22' port='0x25'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.22'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='23' port='0x26'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.23'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='24' port='0x27'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.24'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='25' port='0x28'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.25'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-pci-bridge'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.26'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='usb'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='sata' index='0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='ide'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:01:99:b0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tapf7a6775e-6d'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:b8:01:47'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tapd35fce09-85'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:54:f6:34'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tapaf04237a-1f'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:d1:21:db'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tap08b15784-53'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <serial type='pty'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source path='/dev/pts/4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target type='isa-serial' port='0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <model name='isa-serial'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      </target>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <console type='pty' tty='/dev/pts/4'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source path='/dev/pts/4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target type='serial' port='0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </console>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='tablet' bus='usb'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='usb' bus='0' port='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='mouse' bus='ps2'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='keyboard' bus='ps2'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <graphics type='vnc' port='5904' autoport='yes' listen='::0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <listen type='address' address='::0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <audio id='1' type='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio' heads='1' primary='yes'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='video0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <watchdog model='itco' action='reset'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='watchdog0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </watchdog>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <memballoon model='virtio'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <stats period='10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='balloon0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <rng model='virtio'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backend model='random'>/dev/urandom</backend>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='rng0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <label>system_u:system_r:svirt_t:s0:c138,c973</label>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c973</imagelabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <label>+107:+107</label>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <imagelabel>+107:+107</imagelabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 INFO nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the persistent domain config.
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] (1/8): Attempting to detach device tapd35fce09-85 with device alias net1 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.041 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] detach device xml: <interface type="ethernet">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <mac address="fa:16:3e:b8:01:47"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <model type="virtio"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <mtu size="1442"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <target dev="tapd35fce09-85"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Dec  5 07:09:05 np0005546909 kernel: tapd35fce09-85 (unregistering): left promiscuous mode
Dec  5 07:09:05 np0005546909 NetworkManager[55691]: <info>  [1764936545.1032] device (tapd35fce09-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:05Z|00661|binding|INFO|Releasing lport d35fce09-856e-4ebf-b944-0c0953a9492b from this chassis (sb_readonly=0)
Dec  5 07:09:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:05Z|00662|binding|INFO|Setting lport d35fce09-856e-4ebf-b944-0c0953a9492b down in Southbound
Dec  5 07:09:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:05Z|00663|binding|INFO|Removing iface tapd35fce09-85 ovn-installed in OVS
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.145 187212 DEBUG nova.virt.libvirt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Received event <DeviceRemovedEvent: 1764936545.1446915, f1e72d05-87e7-495d-9dbb-1a10b112c69f => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.147 187212 DEBUG nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Start waiting for the detach event from libvirt for device tapd35fce09-85 with device alias net1 for instance f1e72d05-87e7-495d-9dbb-1a10b112c69f _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.147 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.151 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:b8:01:47"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd35fce09-85"/></interface>not found in domain: <domain type='kvm' id='73'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <name>instance-00000043</name>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <uuid>f1e72d05-87e7-495d-9dbb-1a10b112c69f</uuid>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:08:48</nova:creationTime>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="d35fce09-856e-4ebf-b944-0c0953a9492b">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <memory unit='KiB'>131072</memory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <currentMemory unit='KiB'>131072</currentMemory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <vcpu placement='static'>1</vcpu>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <resource>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <partition>/machine</partition>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </resource>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <sysinfo type='smbios'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='manufacturer'>RDO</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='product'>OpenStack Compute</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='serial'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='uuid'>f1e72d05-87e7-495d-9dbb-1a10b112c69f</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <entry name='family'>Virtual Machine</entry>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <boot dev='hd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <smbios mode='sysinfo'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <vmcoreinfo state='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <cpu mode='custom' match='exact' check='full'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <model fallback='forbid'>EPYC-Rome</model>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <vendor>AMD</vendor>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='x2apic'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc-deadline'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='hypervisor'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='tsc_adjust'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='spec-ctrl'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='stibp'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='cmp_legacy'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='overflow-recov'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='succor'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='ibrs'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='amd-ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='virt-ssbd'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='lbrv'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='tsc-scale'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='vmcb-clean'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='flushbyasid'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pause-filter'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='pfthreshold'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svme-addr-chk'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='lfence-always-serializing'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='xsaves'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='svm'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='require' name='topoext'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='npt'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <feature policy='disable' name='nrip-save'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <clock offset='utc'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='pit' tickpolicy='delay'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='rtc' tickpolicy='catchup'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <timer name='hpet' present='no'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_poweroff>destroy</on_poweroff>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_reboot>restart</on_reboot>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <on_crash>destroy</on_crash>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <disk type='file' device='disk'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='qemu' type='qcow2' cache='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk' index='2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backingStore type='file' index='3'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <format type='raw'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <source file='/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <backingStore/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      </backingStore>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='vda' bus='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='virtio-disk0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <disk type='file' device='cdrom'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='qemu' type='raw' cache='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/disk.config' index='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backingStore/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='sda' bus='sata'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <readonly/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='sata0-0-0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='0' model='pcie-root'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pcie.0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='1' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='1' port='0x10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='2' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='2' port='0x11'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='3' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='3' port='0x12'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='4' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='4' port='0x13'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='5' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='5' port='0x14'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='6' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='6' port='0x15'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='7' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='7' port='0x16'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='8' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='8' port='0x17'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.8'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='9' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='9' port='0x18'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.9'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='10' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='10' port='0x19'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='11' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='11' port='0x1a'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.11'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='12' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='12' port='0x1b'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.12'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='13' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='13' port='0x1c'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.13'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='14' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='14' port='0x1d'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.14'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='15' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='15' port='0x1e'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.15'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='16' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='16' port='0x1f'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.16'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='17' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='17' port='0x20'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.17'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='18' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='18' port='0x21'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.18'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='19' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='19' port='0x22'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.19'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='20' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='20' port='0x23'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.20'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='21' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='21' port='0x24'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.21'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='22' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='22' port='0x25'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.22'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='23' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='23' port='0x26'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.23'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='24' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='24' port='0x27'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.24'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='25' model='pcie-root-port'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-root-port'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target chassis='25' port='0x28'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.25'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model name='pcie-pci-bridge'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='pci.26'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='usb' index='0' model='piix3-uhci'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='usb'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <controller type='sata' index='0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='ide'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </controller>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:01:99:b0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tapf7a6775e-6d'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:54:f6:34'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tapaf04237a-1f'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <interface type='ethernet'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mac address='fa:16:3e:d1:21:db'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target dev='tap08b15784-53'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <driver name='vhost' rx_queue_size='512'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <mtu size='1442'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='net3'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <serial type='pty'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source path='/dev/pts/4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target type='isa-serial' port='0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:        <model name='isa-serial'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      </target>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <console type='pty' tty='/dev/pts/4'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <source path='/dev/pts/4'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <log file='/var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f/console.log' append='off'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <target type='serial' port='0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='serial0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </console>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='tablet' bus='usb'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='usb' bus='0' port='1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='mouse' bus='ps2'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input1'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <input type='keyboard' bus='ps2'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='input2'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </input>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <graphics type='vnc' port='5904' autoport='yes' listen='::0'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <listen type='address' address='::0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </graphics>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <audio id='1' type='none'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <model type='virtio' heads='1' primary='yes'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='video0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <watchdog model='itco' action='reset'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='watchdog0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </watchdog>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <memballoon model='virtio'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <stats period='10'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='balloon0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <rng model='virtio'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <backend model='random'>/dev/urandom</backend>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <alias name='rng0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <label>system_u:system_r:svirt_t:s0:c138,c973</label>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c138,c973</imagelabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <label>+107:+107</label>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <imagelabel>+107:+107</imagelabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </seclabel>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.151 187212 INFO nova.virt.libvirt.driver [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully detached device tapd35fce09-85 from instance f1e72d05-87e7-495d-9dbb-1a10b112c69f from the live domain config.#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.152 187212 DEBUG nova.virt.libvirt.vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.152 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "d35fce09-856e-4ebf-b944-0c0953a9492b", "address": "fa:16:3e:b8:01:47", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd35fce09-85", "ovs_interfaceid": "d35fce09-856e-4ebf-b944-0c0953a9492b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.153 187212 DEBUG nova.network.os_vif_util [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.153 187212 DEBUG os_vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.155 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd35fce09-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.161 187212 INFO os_vif [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:01:47,bridge_name='br-int',has_traffic_filtering=True,id=d35fce09-856e-4ebf-b944-0c0953a9492b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd35fce09-85')#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.162 187212 DEBUG nova.virt.libvirt.guest [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1099990882</nova:name>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:creationTime>2025-12-05 12:09:05</nova:creationTime>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:flavor name="m1.nano">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:memory>128</nova:memory>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:disk>1</nova:disk>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:swap>0</nova:swap>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:flavor>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:owner>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  <nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="f7a6775e-6d9c-48e1-91d7-829a6f5f3742">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="af04237a-1f79-4f68-a18e-1ceb4911605b">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    <nova:port uuid="08b15784-5374-4fb3-9f63-82412f709db4">
Dec  5 07:09:05 np0005546909 nova_compute[187208]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:    </nova:port>
Dec  5 07:09:05 np0005546909 nova_compute[187208]:  </nova:ports>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: </nova:instance>
Dec  5 07:09:05 np0005546909 nova_compute[187208]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.216 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:01:47 10.100.0.3'], port_security=['fa:16:3e:b8:01:47 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=d35fce09-856e-4ebf-b944-0c0953a9492b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.217 104471 INFO neutron.agent.ovn.metadata.agent [-] Port d35fce09-856e-4ebf-b944-0c0953a9492b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.220 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.235 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d869dfb1-086b-4ab7-8725-0b60458d6585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.263 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[29df3787-cf94-4b95-b554-e4ec87fd7c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.268 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[905dcdd7-aa67-483e-8f71-91169d3cf6d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.303 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0570f9ea-1ebf-4865-8be4-6cf6d8a2d058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.319 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[87fb661e-e645-48db-ae94-b8494bf7284a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 868, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 12, 'rx_bytes': 868, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230300, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.341 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f08b6aa6-0197-44c3-8eca-0054f1988f1a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230301, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230301, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.343 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 nova_compute[187208]: 2025-12-05 12:09:05.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.346 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.346 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.347 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:05.347 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:06 np0005546909 podman[230302]: 2025-12-05 12:09:06.201292387 +0000 UTC m=+0.053076801 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:09:06 np0005546909 podman[230303]: 2025-12-05 12:09:06.245133012 +0000 UTC m=+0.084528851 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec  5 07:09:06 np0005546909 nova_compute[187208]: 2025-12-05 12:09:06.725 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Successfully created port: cf99cdda-7071-4c18-8462-3a556234d81d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:09:07 np0005546909 nova_compute[187208]: 2025-12-05 12:09:07.981 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.011 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.012 187212 DEBUG nova.network.neutron [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.120 187212 DEBUG oslo_concurrency.lockutils [req-dbab409c-8f71-4e11-ba08-3e8edba5d911 req-e50cb29c-6731-4434-82fd-d43846c4d8bb 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.380 187212 DEBUG nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.411 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.412 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.412 187212 INFO nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Unshelving#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.494 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Successfully updated port: cf99cdda-7071-4c18-8462-3a556234d81d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.512 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.513 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquired lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.513 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.522 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.523 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.527 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_requests' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.551 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.570 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.570 187212 INFO nova.compute.claims [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.676 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.787 187212 DEBUG nova.compute.provider_tree [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.802 187212 DEBUG nova.scheduler.client.report [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:08 np0005546909 nova_compute[187208]: 2025-12-05 12:09:08.822 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.438 187212 INFO nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating port 2e9efd6c-740c-405b-b9f0-bd46434070a7 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.483 187212 DEBUG nova.network.neutron [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.517 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Releasing lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.518 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance network_info: |[{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.521 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start _get_guest_xml network_info=[{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.527 187212 WARNING nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.531 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.531 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.535 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.536 187212 DEBUG nova.virt.libvirt.host [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.537 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.538 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.539 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.540 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.541 187212 DEBUG nova.virt.hardware [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.545 187212 DEBUG nova.virt.libvirt.vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTe
stJSON-355236921-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:04Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.546 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.547 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.547 187212 DEBUG nova.objects.instance [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'pci_devices' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.563 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <uuid>dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</uuid>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <name>instance-00000048</name>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerMetadataTestJSON-server-1390207148</nova:name>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:09:09</nova:creationTime>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:user uuid="4f805540d6084f53aa7bd5a66912be58">tempest-ServerMetadataTestJSON-355236921-project-member</nova:user>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:project uuid="1bdbd9c8684c4b9b97e00725e41037eb">tempest-ServerMetadataTestJSON-355236921</nova:project>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        <nova:port uuid="cf99cdda-7071-4c18-8462-3a556234d81d">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="serial">dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="uuid">dbbad270-1e3c-41e1-9173-c1b9df0ab2dd</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:60:68:ad"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <target dev="tapcf99cdda-70"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/console.log" append="off"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:09:09 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:09:09 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:09:09 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:09:09 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.564 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Preparing to wait for external event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.564 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.565 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.565 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.566 187212 DEBUG nova.virt.libvirt.vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTestJSON-355236921-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:04Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.566 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG nova.network.os_vif_util [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG os_vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.567 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.568 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.568 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf99cdda-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.574 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcf99cdda-70, col_values=(('external_ids', {'iface-id': 'cf99cdda-7071-4c18-8462-3a556234d81d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:68:ad', 'vm-uuid': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:09 np0005546909 NetworkManager[55691]: <info>  [1764936549.5774] manager: (tapcf99cdda-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.578 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.582 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.584 187212 INFO os_vif [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70')#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.659 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.659 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.660 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] No VIF found with MAC fa:16:3e:60:68:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:09:09 np0005546909 nova_compute[187208]: 2025-12-05 12:09:09.660 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Using config drive#033[00m
Dec  5 07:09:09 np0005546909 podman[230355]: 2025-12-05 12:09:09.690161756 +0000 UTC m=+0.066311169 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.474 187212 INFO nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Creating config drive at /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config#033[00m
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.479 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l9s9dwu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.608 187212 DEBUG oslo_concurrency.processutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l9s9dwu" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:11 np0005546909 kernel: tapcf99cdda-70: entered promiscuous mode
Dec  5 07:09:11 np0005546909 NetworkManager[55691]: <info>  [1764936551.6797] manager: (tapcf99cdda-70): new Tun device (/org/freedesktop/NetworkManager/Devices/261)
Dec  5 07:09:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:11Z|00664|binding|INFO|Claiming lport cf99cdda-7071-4c18-8462-3a556234d81d for this chassis.
Dec  5 07:09:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:11Z|00665|binding|INFO|cf99cdda-7071-4c18-8462-3a556234d81d: Claiming fa:16:3e:60:68:ad 10.100.0.4
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.688 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:68:ad 10.100.0.4'], port_security=['fa:16:3e:60:68:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d91f504-323f-40f6-96ee-8e841aa785bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd596033-693a-40ca-949c-841d866181bd, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=cf99cdda-7071-4c18-8462-3a556234d81d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.689 104471 INFO neutron.agent.ovn.metadata.agent [-] Port cf99cdda-7071-4c18-8462-3a556234d81d in datapath d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 bound to our chassis#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.691 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22#033[00m
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:11Z|00666|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d ovn-installed in OVS
Dec  5 07:09:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:11Z|00667|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d up in Southbound
Dec  5 07:09:11 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.699 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.702 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c9300f9f-6a5b-4543-b835-c2fc98d5e57a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.703 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd5794fbb-c1 in ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.705 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd5794fbb-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.705 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca3547f-8479-48e4-a53f-9c7db733cbdd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 systemd-udevd[230394]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.707 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f207c841-7baf-4e47-b436-0d412d33d5c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 systemd-machined[153543]: New machine qemu-80-instance-00000048.
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.721 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0d16414f-662d-42cc-8f3e-202703c9c6c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 NetworkManager[55691]: <info>  [1764936551.7257] device (tapcf99cdda-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:09:11 np0005546909 NetworkManager[55691]: <info>  [1764936551.7268] device (tapcf99cdda-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:09:11 np0005546909 systemd[1]: Started Virtual Machine qemu-80-instance-00000048.
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.740 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3171669-0cd5-4223-8d8a-b14d37242f4d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.779 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a6e316cc-da70-4c16-8f8f-6f0123c15695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 NetworkManager[55691]: <info>  [1764936551.7872] manager: (tapd5794fbb-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/262)
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[131ee832-c341-4bfc-8177-27e41ab163e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.817 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9725e671-52ad-4f71-abe2-1d8069d174dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.821 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[297396c9-8ee8-4398-8338-0e8387ec7843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 NetworkManager[55691]: <info>  [1764936551.8441] device (tapd5794fbb-c0): carrier: link connected
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.848 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e53a3cda-9c50-4d9b-877a-d19cde7c566d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.864 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9547a7da-6803-402a-996e-9d2008ed683e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5794fbb-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:2f:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393240, 'reachable_time': 34098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230427, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.879 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a84495b8-65d7-472f-8248-9ba2b2cf53f6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:2f2a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 393240, 'tstamp': 393240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230428, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.897 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42fba68c-9267-4c51-a149-f70e62462ad7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd5794fbb-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:2f:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393240, 'reachable_time': 34098, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230429, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.930 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd2441e-a421-4fa1-bb24-60aeb06ce054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.995 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce8487c-f9a7-4d6b-ab67-f7d6c07327f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5794fbb-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:11.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5794fbb-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:12 np0005546909 kernel: tapd5794fbb-c0: entered promiscuous mode
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:11.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:12 np0005546909 NetworkManager[55691]: <info>  [1764936552.0011] manager: (tapd5794fbb-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd5794fbb-c0, col_values=(('external_ids', {'iface-id': 'da9adcd8-f2a5-4ff7-962a-717d700ad7b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:12Z|00668|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.017 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.018 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f958c892-c100-4fde-8caa-f87eb2178987]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.018 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.pid.haproxy
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:09:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:12.019 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'env', 'PROCESS_TAG=haproxy-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.302 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936552.3021243, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.303 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Started (Lifecycle Event)#033[00m
Dec  5 07:09:12 np0005546909 podman[230467]: 2025-12-05 12:09:12.460873714 +0000 UTC m=+0.069047198 container create 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:09:12 np0005546909 systemd[1]: Started libpod-conmon-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope.
Dec  5 07:09:12 np0005546909 podman[230467]: 2025-12-05 12:09:12.41917767 +0000 UTC m=+0.027351194 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.521 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.527 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936552.3024373, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:09:12 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:09:12 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58192d43ddea201cbf99dd4a079d9f14871a3e9699282f5a030bf56f666b8ee1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:09:12 np0005546909 podman[230467]: 2025-12-05 12:09:12.566874089 +0000 UTC m=+0.175047593 container init 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:09:12 np0005546909 podman[230467]: 2025-12-05 12:09:12.574040834 +0000 UTC m=+0.182214318 container start 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:09:12 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : New worker (230489) forked
Dec  5 07:09:12 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : Loading success.
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.657 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.666 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.692 187212 DEBUG nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.781 187212 INFO nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Port d35fce09-856e-4ebf-b944-0c0953a9492b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG nova.compute.manager [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG nova.compute.manager [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.809 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.810 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.835 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.941 187212 DEBUG nova.compute.manager [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-changed-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.941 187212 DEBUG nova.compute.manager [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Refreshing instance network info cache due to event network-changed-cf99cdda-7071-4c18-8462-3a556234d81d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.942 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Refreshing network info cache for port cf99cdda-7071-4c18-8462-3a556234d81d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:12 np0005546909 nova_compute[187208]: 2025-12-05 12:09:12.983 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.215 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.216 187212 DEBUG nova.network.neutron [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.235 187212 DEBUG oslo_concurrency.lockutils [req-cf33f03f-7f2f-4d51-a3ee-ed6138c94bf7 req-c4f467cb-735f-46da-995f-ff6b15c8e8e5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.420 104471 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 142f48b6-9a20-4cd8-b984-7849deca313b with type ""#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.422 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:21:db 10.100.0.14'], port_security=['fa:16:3e:d1:21:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-274660127', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=08b15784-5374-4fb3-9f63-82412f709db4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.424 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 08b15784-5374-4fb3-9f63-82412f709db4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:09:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:14Z|00669|binding|INFO|Removing iface tap08b15784-53 ovn-installed in OVS
Dec  5 07:09:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:14Z|00670|binding|INFO|Removing lport 08b15784-5374-4fb3-9f63-82412f709db4 ovn-installed in OVS
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[92a6b6a8-87a3-4e69-bb8d-0eb474f23746]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.473 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[617e49b8-a5ed-4a06-b8a6-d0acf1f77a40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.479 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd1c8d4-183c-4b66-b8f3-d01b51b8ed03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e5259931-f2be-47ab-8f9d-9b3b1db9718b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.534 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1f15a823-8b25-4d21-96b6-9b8865eb4522]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 868, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 14, 'rx_bytes': 868, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230503, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.553 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[71e0cf1f-20b9-45c0-a603-ce68116fced2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.555 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.559 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.559 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.560 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:14.561 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:14 np0005546909 nova_compute[187208]: 2025-12-05 12:09:14.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.220 187212 DEBUG nova.network.neutron [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.407 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.409 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.409 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating image(s)#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.410 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.410 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.411 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.411 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.415 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updated VIF entry in instance network info cache for port cf99cdda-7071-4c18-8462-3a556234d81d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.416 187212 DEBUG nova.network.neutron [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [{"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.455 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "4e67c74a736d89d49bae230086f8944c0448c13d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.456 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:16 np0005546909 nova_compute[187208]: 2025-12-05 12:09:16.459 187212 DEBUG oslo_concurrency.lockutils [req-4185ce17-0f7b-4a17-9f62-1d18da1964e6 req-6ee6e79a-8417-4a5b-8a31-be020962b2a0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:17 np0005546909 podman[230505]: 2025-12-05 12:09:17.221354331 +0000 UTC m=+0.065086434 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:09:17 np0005546909 nova_compute[187208]: 2025-12-05 12:09:17.985 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.341 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.402 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.403 187212 DEBUG nova.virt.images [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] 4d0314d0-2208-4446-8d20-5c2197f0bd9d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.405 187212 DEBUG nova.privsep.utils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.405 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:18 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:09:18 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.616 187212 DEBUG nova.network.neutron [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.741 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.part /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted" returned: 0 in 0.336s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.751 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.812 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.814 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.358s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.833 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-f1e72d05-87e7-495d-9dbb-1a10b112c69f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.840 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.907 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.908 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "4e67c74a736d89d49bae230086f8944c0448c13d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.909 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.925 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.964 187212 DEBUG oslo_concurrency.lockutils [None req-f2f83d6d-0f6d-4d04-a4ca-cc140a76c4c9 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "interface-f1e72d05-87e7-495d-9dbb-1a10b112c69f-d35fce09-856e-4ebf-b944-0c0953a9492b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 13.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.993 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:18 np0005546909 nova_compute[187208]: 2025-12-05 12:09:18.993 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.056 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d,backing_fmt=raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.057 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "4e67c74a736d89d49bae230086f8944c0448c13d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.058 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.116 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.118 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'migration_context' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.143 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Rebasing disk image.#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.144 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.201 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.202 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.431 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.431 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.432 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.434 187212 INFO nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Terminating instance#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.436 187212 DEBUG nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:19 np0005546909 kernel: tapf7a6775e-6d (unregistering): left promiscuous mode
Dec  5 07:09:19 np0005546909 NetworkManager[55691]: <info>  [1764936559.4793] device (tapf7a6775e-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 kernel: tapaf04237a-1f (unregistering): left promiscuous mode
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00671|binding|INFO|Releasing lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 from this chassis (sb_readonly=0)
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00672|binding|INFO|Setting lport f7a6775e-6d9c-48e1-91d7-829a6f5f3742 down in Southbound
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00673|binding|INFO|Removing iface tapf7a6775e-6d ovn-installed in OVS
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.502 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.503 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.503 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:19 np0005546909 NetworkManager[55691]: <info>  [1764936559.5041] device (tapaf04237a-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.504 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.505 187212 WARNING nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-d35fce09-856e-4ebf-b944-0c0953a9492b for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-d35fce09-856e-4ebf-b944-0c0953a9492b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing instance network info cache due to event network-changed-2e9efd6c-740c-405b-b9f0-bd46434070a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.506 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.507 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.507 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Refreshing network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.514 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:99:b0 10.100.0.7'], port_security=['fa:16:3e:01:99:b0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '83c79c65-073e-4860-a990-92e9abafc0bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a6775e-6d9c-48e1-91d7-829a6f5f3742) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.515 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a6775e-6d9c-48e1-91d7-829a6f5f3742 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.518 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00674|binding|INFO|Releasing lport af04237a-1f79-4f68-a18e-1ceb4911605b from this chassis (sb_readonly=0)
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00675|binding|INFO|Setting lport af04237a-1f79-4f68-a18e-1ceb4911605b down in Southbound
Dec  5 07:09:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:19Z|00676|binding|INFO|Removing iface tapaf04237a-1f ovn-installed in OVS
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.538 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:f6:34 10.100.0.10'], port_security=['fa:16:3e:54:f6:34 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f1e72d05-87e7-495d-9dbb-1a10b112c69f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da2c1744-fe64-413a-81b2-519102613e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=af04237a-1f79-4f68-a18e-1ceb4911605b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.537 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee47265-f580-4cd2-abf7-d97d67d870c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.530 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.544 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 kernel: tap08b15784-53 (unregistering): left promiscuous mode
Dec  5 07:09:19 np0005546909 NetworkManager[55691]: <info>  [1764936559.5517] device (tap08b15784-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.575 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[30fde81e-7760-47e5-8e97-93a9970aa448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.578 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[10ae17e4-043e-4b0e-8110-c5556dcd5ba2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000043.scope: Deactivated successfully.
Dec  5 07:09:19 np0005546909 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000043.scope: Consumed 16.764s CPU time.
Dec  5 07:09:19 np0005546909 systemd-machined[153543]: Machine qemu-73-instance-00000043 terminated.
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.604 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b8bfbf65-b3ae-42eb-8a2e-91392e665dc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.621 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a25b2ee-f1a1-4014-864b-8c8b22f6c55e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 868, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 16, 'rx_bytes': 868, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 159], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383482, 'reachable_time': 15698, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230586, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.636 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1c37f4-c312-4340-bc03-419089ce5079]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383494, 'tstamp': 383494}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230587, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 383497, 'tstamp': 383497}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230587, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.637 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.639 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.647 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.648 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.648 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.649 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.649 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.650 104471 INFO neutron.agent.ovn.metadata.agent [-] Port af04237a-1f79-4f68-a18e-1ceb4911605b in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c unbound from our chassis#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.652 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.653 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d801f682-2a78-4d4c-8b7e-ca2378352244]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:19.654 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace which is not needed anymore#033[00m
Dec  5 07:09:19 np0005546909 NetworkManager[55691]: <info>  [1764936559.6698] manager: (tapaf04237a-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Dec  5 07:09:19 np0005546909 NetworkManager[55691]: <info>  [1764936559.6828] manager: (tap08b15784-53): new Tun device (/org/freedesktop/NetworkManager/Devices/265)
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.722 187212 INFO nova.virt.libvirt.driver [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Instance destroyed successfully.#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.723 187212 DEBUG nova.objects.instance [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'resources' on Instance uuid f1e72d05-87e7-495d-9dbb-1a10b112c69f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.739 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.740 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.741 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.741 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.744 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a6775e-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.755 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:99:b0,bridge_name='br-int',has_traffic_filtering=True,id=f7a6775e-6d9c-48e1-91d7-829a6f5f3742,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a6775e-6d')#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.756 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.757 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.759 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.759 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.761 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf04237a-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.766 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.768 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.772 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:f6:34,bridge_name='br-int',has_traffic_filtering=True,id=af04237a-1f79-4f68-a18e-1ceb4911605b,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf04237a-1f')#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.773 187212 DEBUG nova.virt.libvirt.vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1099990882',display_name='tempest-AttachInterfacesTestJSON-server-1099990882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1099990882',id=67,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGWyizi6axh3g1lqh3EssS8Rsy4cRJr2O9dnqAqiqeumCgJflOAzBLIArmZdzv3bF2muOe0KxCJTvAF8vGbOdDZdh1AZ+T+oHyUD1boLu7DnjEFnqoYggnqfVAdSxHRAbQ==',key_name='tempest-keypair-1730740858',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-ahzpuadl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=f1e72d05-87e7-495d-9dbb-1a10b112c69f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.773 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "08b15784-5374-4fb3-9f63-82412f709db4", "address": "fa:16:3e:d1:21:db", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08b15784-53", "ovs_interfaceid": "08b15784-5374-4fb3-9f63-82412f709db4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.774 187212 DEBUG nova.network.os_vif_util [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.775 187212 DEBUG os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.776 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08b15784-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.783 187212 INFO os_vif [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:21:db,bridge_name='br-int',has_traffic_filtering=True,id=08b15784-5374-4fb3-9f63-82412f709db4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap08b15784-53')#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.784 187212 INFO nova.virt.libvirt.driver [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deleting instance files /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f_del#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.785 187212 INFO nova.virt.libvirt.driver [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deletion of /var/lib/nova/instances/f1e72d05-87e7-495d-9dbb-1a10b112c69f_del complete#033[00m
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : haproxy version is 2.8.14-c23fe91
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [NOTICE]   (228701) : path to executable is /usr/sbin/haproxy
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : Exiting Master process...
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : Exiting Master process...
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [ALERT]    (228701) : Current worker (228703) exited with code 143 (Terminated)
Dec  5 07:09:19 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[228696]: [WARNING]  (228701) : All workers exited. Exiting... (0)
Dec  5 07:09:19 np0005546909 systemd[1]: libpod-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope: Deactivated successfully.
Dec  5 07:09:19 np0005546909 podman[230652]: 2025-12-05 12:09:19.85414553 +0000 UTC m=+0.088244118 container died 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.865 187212 INFO nova.compute.manager [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.865 187212 DEBUG oslo.service.loopingcall [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.866 187212 DEBUG nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:19 np0005546909 nova_compute[187208]: 2025-12-05 12:09:19.866 187212 DEBUG nova.network.neutron [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b-userdata-shm.mount: Deactivated successfully.
Dec  5 07:09:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay-186f8522ea6206e72ff71431d61cc132ff2e048571a25807429a54cf15a146be-merged.mount: Deactivated successfully.
Dec  5 07:09:20 np0005546909 podman[230652]: 2025-12-05 12:09:20.072838711 +0000 UTC m=+0.306937299 container cleanup 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:20 np0005546909 podman[230682]: 2025-12-05 12:09:20.168088078 +0000 UTC m=+0.073211837 container remove 9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:09:20 np0005546909 systemd[1]: libpod-conmon-9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b.scope: Deactivated successfully.
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.174 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f95b74c5-264d-4149-853e-4ad3c049cbf6]: (4, ('Fri Dec  5 12:09:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b)\n9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b\nFri Dec  5 12:09:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c (9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b)\n9498211e05749d2207127ff12e351cf14c3479092f7b05203873c1fd2498c91b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.177 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[928b2675-7241-498c-a730-042ebe658f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.179 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 kernel: tapfbfed6fc-30: left promiscuous mode
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.188 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe1bc8d-9823-4d45-ae67-2cb1a1168ea8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.207 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[15c3cc5c-2db5-4a95-8347-30133536f04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.209 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b868d60a-e322-492e-96a6-ed5f352100ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.224 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e42669-7f70-44c0-9e7d-1634a937b8d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383474, 'reachable_time': 22757, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230695, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 systemd[1]: run-netns-ovnmeta\x2dfbfed6fc\x2d3701\x2d4311\x2da4c2\x2d8c49c5b7584c.mount: Deactivated successfully.
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.226 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:09:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:20.227 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[fea878ea-961a-4510-aade-50d07aea306b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 -F raw /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk" returned: 0 in 1.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.515 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Ensure instance console log exists: /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.516 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.518 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start _get_guest_xml network_info=[{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='a6bda2862c2c7b673534b551216afe30',container_format='bare',created_at=2025-12-05T12:08:38Z,direct_url=<?>,disk_format='qcow2',id=4d0314d0-2208-4446-8d20-5c2197f0bd9d,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1629320086-shelved',owner='58cbd93e463049988ccd6d013893e7d6',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:08:51Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.522 187212 WARNING nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.530 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.531 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.535 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.535 187212 DEBUG nova.virt.libvirt.host [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='a6bda2862c2c7b673534b551216afe30',container_format='bare',created_at=2025-12-05T12:08:38Z,direct_url=<?>,disk_format='qcow2',id=4d0314d0-2208-4446-8d20-5c2197f0bd9d,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1629320086-shelved',owner='58cbd93e463049988ccd6d013893e7d6',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-12-05T12:08:51Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.536 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.537 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.virt.hardware [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.538 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.569 187212 DEBUG nova.virt.libvirt.vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='4d0314d0-2208-4446-8d20-5c2197f0bd9d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',
image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.569 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.570 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.571 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.590 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <uuid>24358eea-14fb-4863-a6c4-aadcdb495f54</uuid>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <name>instance-00000036</name>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestOtherB-server-1629320086</nova:name>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:09:20</nova:creationTime>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:user uuid="4ad1281afc874c0ca55d908d3a6e05a8">tempest-ServerActionsTestOtherB-1759520420-project-member</nova:user>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:project uuid="58cbd93e463049988ccd6d013893e7d6">tempest-ServerActionsTestOtherB-1759520420</nova:project>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="4d0314d0-2208-4446-8d20-5c2197f0bd9d"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        <nova:port uuid="2e9efd6c-740c-405b-b9f0-bd46434070a7">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="serial">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="uuid">24358eea-14fb-4863-a6c4-aadcdb495f54</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ab:5e:ef"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <target dev="tap2e9efd6c-74"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/console.log" append="off"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <input type="keyboard" bus="usb"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:09:20 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:09:20 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:09:20 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:09:20 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.590 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Preparing to wait for external event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.591 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.virt.libvirt.vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='4d0314d0-2208-4446-8d20-5c2197f0bd9d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:05:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model
='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member',shelved_at='2025-12-05T12:08:51.833545',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='4d0314d0-2208-4446-8d20-5c2197f0bd9d'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.592 187212 DEBUG nova.network.os_vif_util [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG os_vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.593 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.594 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.596 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2e9efd6c-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.597 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2e9efd6c-74, col_values=(('external_ids', {'iface-id': '2e9efd6c-740c-405b-b9f0-bd46434070a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:5e:ef', 'vm-uuid': '24358eea-14fb-4863-a6c4-aadcdb495f54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 NetworkManager[55691]: <info>  [1764936560.5998] manager: (tap2e9efd6c-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.604 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.604 187212 INFO os_vif [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.677 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.677 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.678 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] No VIF found with MAC fa:16:3e:ab:5e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.678 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Using config drive#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.703 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:20 np0005546909 nova_compute[187208]: 2025-12-05 12:09:20.754 187212 DEBUG nova.objects.instance [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'keypairs' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.059 187212 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port 08b15784-5374-4fb3-9f63-82412f709db4 could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.060 187212 DEBUG nova.network.neutron [-] Unable to show port 08b15784-5374-4fb3-9f63-82412f709db4 as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.389 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.390 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Processing event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG oslo_concurrency.lockutils [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.391 187212 DEBUG nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.392 187212 WARNING nova.compute.manager [req-22ee85dc-c408-4fec-b02a-6e5536a3c999 req-3476baa4-d1dc-43bb-8aa4-344ae9bedeb0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received unexpected event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.392 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.396 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936562.3965611, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.397 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.398 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.402 187212 INFO nova.virt.libvirt.driver [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance spawned successfully.#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.403 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.417 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.423 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.426 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.426 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.427 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.428 187212 DEBUG nova.virt.libvirt.driver [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.465 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.504 187212 INFO nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 17.83 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.504 187212 DEBUG nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.546 187212 INFO nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Creating config drive at /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.552 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjoed27ec execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.595 187212 INFO nova.compute.manager [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 18.43 seconds to build instance.#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.681 187212 DEBUG oslo_concurrency.processutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjoed27ec" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:22 np0005546909 kernel: tap2e9efd6c-74: entered promiscuous mode
Dec  5 07:09:22 np0005546909 NetworkManager[55691]: <info>  [1764936562.7449] manager: (tap2e9efd6c-74): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Dec  5 07:09:22 np0005546909 systemd-udevd[230712]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:09:22 np0005546909 NetworkManager[55691]: <info>  [1764936562.7902] device (tap2e9efd6c-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:09:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:22Z|00677|binding|INFO|Claiming lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 for this chassis.
Dec  5 07:09:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:22Z|00678|binding|INFO|2e9efd6c-740c-405b-b9f0-bd46434070a7: Claiming fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:22 np0005546909 NetworkManager[55691]: <info>  [1764936562.7945] device (tap2e9efd6c-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:09:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:22Z|00679|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 ovn-installed in OVS
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.805 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:22 np0005546909 systemd-machined[153543]: New machine qemu-81-instance-00000036.
Dec  5 07:09:22 np0005546909 systemd[1]: Started Virtual Machine qemu-81-instance-00000036.
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.868 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.870 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 bound to our chassis#033[00m
Dec  5 07:09:22 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:22Z|00680|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 up in Southbound
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.868 187212 DEBUG oslo_concurrency.lockutils [None req-4231ad14-0548-42e3-b6cf-0d29d2a5ab4d 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.872 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.890 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[89fe3388-30c0-41e6-a392-60fbee328dbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.921 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb12523-4759-41c6-a751-797f7204e07e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.925 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc3a648-890d-4c88-aaef-fe593f7574c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.957 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a46c9a96-4472-4f0c-b0e2-27542b86d63a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.977 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fef62758-73fd-4999-9baa-c40d0f9f3857]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 17, 'rx_bytes': 700, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230730, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.987 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dc72f154-13ea-478b-81fc-bf076a1b5ef7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230731, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230731, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:22 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:22.997 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.001 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.001 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:23 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:23.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:22.999 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.000 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.432 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated VIF entry in instance network info cache for port 2e9efd6c-740c-405b-b9f0-bd46434070a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.432 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.467 187212 DEBUG oslo_concurrency.lockutils [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.467 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-08b15784-5374-4fb3-9f63-82412f709db4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.468 187212 INFO nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Neutron deleted interface 08b15784-5374-4fb3-9f63-82412f709db4; detaching it from the instance and deleting it from the info cache#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.468 187212 DEBUG nova.network.neutron [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [{"id": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "address": "fa:16:3e:01:99:b0", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a6775e-6d", "ovs_interfaceid": "f7a6775e-6d9c-48e1-91d7-829a6f5f3742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "af04237a-1f79-4f68-a18e-1ceb4911605b", "address": "fa:16:3e:54:f6:34", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf04237a-1f", "ovs_interfaceid": "af04237a-1f79-4f68-a18e-1ceb4911605b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:23 np0005546909 nova_compute[187208]: 2025-12-05 12:09:23.928 187212 DEBUG nova.compute.manager [req-6c7c224d-f158-43d7-8cc2-e882072eca7b req-90acf398-788d-4b73-b05f-7ec05f68395b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Detach interface failed, port_id=08b15784-5374-4fb3-9f63-82412f709db4, reason: Instance f1e72d05-87e7-495d-9dbb-1a10b112c69f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec  5 07:09:24 np0005546909 podman[230733]: 2025-12-05 12:09:24.209176279 +0000 UTC m=+0.064763875 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.292 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936564.2920427, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.293 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Started (Lifecycle Event)#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.354 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.359 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936564.2927058, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.359 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.387 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.391 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.486 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG nova.compute.manager [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG nova.compute.manager [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.691 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:24 np0005546909 nova_compute[187208]: 2025-12-05 12:09:24.692 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:25 np0005546909 nova_compute[187208]: 2025-12-05 12:09:25.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.020 187212 DEBUG nova.network.neutron [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.046 187212 INFO nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Took 7.18 seconds to deallocate network for instance.#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.096 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.096 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.278 187212 DEBUG nova.compute.provider_tree [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.296 187212 DEBUG nova.scheduler.client.report [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.322 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.357 187212 INFO nova.scheduler.client.report [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Deleted allocations for instance f1e72d05-87e7-495d-9dbb-1a10b112c69f#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.420 187212 DEBUG oslo_concurrency.lockutils [None req-15e7cef8-1b43-45a9-8ad3-393f024405a0 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.882 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.883 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.883 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-unplugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.884 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.885 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.886 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.887 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-unplugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.888 187212 DEBUG oslo_concurrency.lockutils [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "f1e72d05-87e7-495d-9dbb-1a10b112c69f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.889 187212 DEBUG nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] No waiting events found dispatching network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.889 187212 WARNING nova.compute.manager [req-42a9c1ad-7225-4798-ba47-df0353d08a95 req-08eba9b7-3b63-43ea-8f4a-4e7f566998a9 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received unexpected event network-vif-plugged-af04237a-1f79-4f68-a18e-1ceb4911605b for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.950 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.951 187212 DEBUG nova.network.neutron [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.971 187212 DEBUG oslo_concurrency.lockutils [req-dad5e8b0-2d16-459a-a0ac-ba0066aae626 req-6de626b9-bfca-400d-b9ef-b0ccb3d519a4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:27 np0005546909 nova_compute[187208]: 2025-12-05 12:09:27.990 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.907 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.908 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Processing event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-f7a6775e-6d9c-48e1-91d7-829a6f5f3742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.909 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG oslo_concurrency.lockutils [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 DEBUG nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.910 187212 WARNING nova.compute.manager [req-9324494d-803c-40b3-9ea5-f208052b91a2 req-3cea4388-d3ca-4c55-a74f-108cb17fd2ef 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.911 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.915 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936569.9148142, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.916 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.921 187212 DEBUG nova.virt.libvirt.driver [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.924 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance spawned successfully.#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.971 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:29 np0005546909 nova_compute[187208]: 2025-12-05 12:09:29.975 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:30 np0005546909 nova_compute[187208]: 2025-12-05 12:09:30.432 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:30 np0005546909 nova_compute[187208]: 2025-12-05 12:09:30.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:32 np0005546909 podman[230766]: 2025-12-05 12:09:32.221909611 +0000 UTC m=+0.063529799 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:09:32 np0005546909 podman[230765]: 2025-12-05 12:09:32.273628512 +0000 UTC m=+0.114389526 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, 
maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 07:09:32 np0005546909 nova_compute[187208]: 2025-12-05 12:09:32.312 187212 DEBUG nova.compute.manager [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:32 np0005546909 nova_compute[187208]: 2025-12-05 12:09:32.387 187212 DEBUG oslo_concurrency.lockutils [None req-e1b667d1-f0b0-4643-9e42-d0d417db1866 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 23.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:32 np0005546909 nova_compute[187208]: 2025-12-05 12:09:32.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.010 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Received event network-vif-deleted-af04237a-1f79-4f68-a18e-1ceb4911605b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.compute.manager [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing instance network info cache due to event network-changed-11c7fa90-6a48-487a-a375-5adf7f41cb90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:33 np0005546909 nova_compute[187208]: 2025-12-05 12:09:33.011 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Refreshing network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:34Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:68:ad 10.100.0.4
Dec  5 07:09:34 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:34Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:68:ad 10.100.0.4
Dec  5 07:09:34 np0005546909 nova_compute[187208]: 2025-12-05 12:09:34.722 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936559.720632, f1e72d05-87e7-495d-9dbb-1a10b112c69f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:34 np0005546909 nova_compute[187208]: 2025-12-05 12:09:34.722 187212 INFO nova.compute.manager [-] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:09:34 np0005546909 nova_compute[187208]: 2025-12-05 12:09:34.742 187212 DEBUG nova.compute.manager [None req-ef119c1b-8bb9-4025-b750-736eb17d7e66 - - - - - -] [instance: f1e72d05-87e7-495d-9dbb-1a10b112c69f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:35 np0005546909 nova_compute[187208]: 2025-12-05 12:09:35.430 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updated VIF entry in instance network info cache for port 11c7fa90-6a48-487a-a375-5adf7f41cb90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:35 np0005546909 nova_compute[187208]: 2025-12-05 12:09:35.431 187212 DEBUG nova.network.neutron [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [{"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:35 np0005546909 nova_compute[187208]: 2025-12-05 12:09:35.449 187212 DEBUG oslo_concurrency.lockutils [req-fc554c09-5efb-4e83-a16b-dc39635b51d6 req-415b5f99-b0e9-489c-9876-f613f7011c4a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-2e537618-f998-4c4d-8e1e-e9cc79219330" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:35 np0005546909 nova_compute[187208]: 2025-12-05 12:09:35.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:35 np0005546909 nova_compute[187208]: 2025-12-05 12:09:35.602 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.252 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.252 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.253 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.254 187212 INFO nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Terminating instance#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.256 187212 DEBUG nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:36 np0005546909 kernel: tapdf4eecd2-b2 (unregistering): left promiscuous mode
Dec  5 07:09:36 np0005546909 NetworkManager[55691]: <info>  [1764936576.2822] device (tapdf4eecd2-b2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:36Z|00681|binding|INFO|Releasing lport df4eecd2-b2e2-445a-acac-232f66123555 from this chassis (sb_readonly=0)
Dec  5 07:09:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:36Z|00682|binding|INFO|Setting lport df4eecd2-b2e2-445a-acac-232f66123555 down in Southbound
Dec  5 07:09:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:36Z|00683|binding|INFO|Removing iface tapdf4eecd2-b2 ovn-installed in OVS
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.303 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.305 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000047.scope: Deactivated successfully.
Dec  5 07:09:36 np0005546909 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000047.scope: Consumed 15.391s CPU time.
Dec  5 07:09:36 np0005546909 systemd-machined[153543]: Machine qemu-78-instance-00000047 terminated.
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.351 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:3b:49 10.100.0.11'], port_security=['fa:16:3e:40:3b:49 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b235a96f-7a12-4bd2-8627-33b128346aa4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bf30ed1956544c7eae67c989042126e4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c4ee2104-41f1-480e-ab3a-db882b9c2d98', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bb90128-3616-41a6-a999-156ce64fbcf7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=df4eecd2-b2e2-445a-acac-232f66123555) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.352 104471 INFO neutron.agent.ovn.metadata.agent [-] Port df4eecd2-b2e2-445a-acac-232f66123555 in datapath 02d8cc87-efdf-4db2-b7ab-393e2480966a unbound from our chassis#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.355 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 02d8cc87-efdf-4db2-b7ab-393e2480966a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.356 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb6cb17-1751-4271-a9bc-67c56a4bbc0c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 podman[230816]: 2025-12-05 12:09:36.357494037 +0000 UTC m=+0.052751711 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.357 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a namespace which is not needed anymore#033[00m
Dec  5 07:09:36 np0005546909 podman[230819]: 2025-12-05 12:09:36.40859399 +0000 UTC m=+0.104048540 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : haproxy version is 2.8.14-c23fe91
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [NOTICE]   (229739) : path to executable is /usr/sbin/haproxy
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : Exiting Master process...
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : Exiting Master process...
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [ALERT]    (229739) : Current worker (229741) exited with code 143 (Terminated)
Dec  5 07:09:36 np0005546909 neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a[229735]: [WARNING]  (229739) : All workers exited. Exiting... (0)
Dec  5 07:09:36 np0005546909 systemd[1]: libpod-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope: Deactivated successfully.
Dec  5 07:09:36 np0005546909 conmon[229735]: conmon 44e956a7e8f863954c3e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope/container/memory.events
Dec  5 07:09:36 np0005546909 podman[230887]: 2025-12-05 12:09:36.490105724 +0000 UTC m=+0.047218283 container died 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.521 187212 INFO nova.virt.libvirt.driver [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Instance destroyed successfully.#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.523 187212 DEBUG nova.objects.instance [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lazy-loading 'resources' on Instance uuid b235a96f-7a12-4bd2-8627-33b128346aa4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:36 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1-userdata-shm.mount: Deactivated successfully.
Dec  5 07:09:36 np0005546909 systemd[1]: var-lib-containers-storage-overlay-3936c809ee6a8001892b1f5e8b230731bdba206d1aa032e18836cd92f8d64675-merged.mount: Deactivated successfully.
Dec  5 07:09:36 np0005546909 podman[230887]: 2025-12-05 12:09:36.541375632 +0000 UTC m=+0.098488201 container cleanup 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:36 np0005546909 systemd[1]: libpod-conmon-44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1.scope: Deactivated successfully.
Dec  5 07:09:36 np0005546909 podman[230933]: 2025-12-05 12:09:36.608799722 +0000 UTC m=+0.045474682 container remove 44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.612 187212 DEBUG nova.virt.libvirt.vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:08:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-959694714',display_name='tempest-ServerMetadataNegativeTestJSON-server-959694714',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-959694714',id=71,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bf30ed1956544c7eae67c989042126e4',ramdisk_id='',reservation_id='r-h9tu7hr7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-91345283',owner_user_name='tempest-ServerMetadataNegativeTestJSON-91345283-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:29Z,user_data=None,user_id='132d581de02e49b9a4c99b9b831dd5b5',uuid=b235a96f-7a12-4bd2-8627-33b128346aa4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.613 187212 DEBUG nova.network.os_vif_util [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converting VIF {"id": "df4eecd2-b2e2-445a-acac-232f66123555", "address": "fa:16:3e:40:3b:49", "network": {"id": "02d8cc87-efdf-4db2-b7ab-393e2480966a", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-34320108-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bf30ed1956544c7eae67c989042126e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf4eecd2-b2", "ovs_interfaceid": "df4eecd2-b2e2-445a-acac-232f66123555", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.615 187212 DEBUG nova.network.os_vif_util [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.614 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[579a75fd-39ea-433a-b4e0-9726482adab0]: (4, ('Fri Dec  5 12:09:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a (44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1)\n44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1\nFri Dec  5 12:09:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a (44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1)\n44e956a7e8f863954c3e3f31f95d638476f1370fd991790a462e0cc31d10a7d1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.615 187212 DEBUG os_vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cc020c01-ca06-4e21-b86e-1e3927360c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.618 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap02d8cc87-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.618 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf4eecd2-b2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.623 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:36 np0005546909 kernel: tap02d8cc87-e0: left promiscuous mode
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.631 187212 INFO os_vif [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:3b:49,bridge_name='br-int',has_traffic_filtering=True,id=df4eecd2-b2e2-445a-acac-232f66123555,network=Network(02d8cc87-efdf-4db2-b7ab-393e2480966a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf4eecd2-b2')#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.632 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ed47019f-749e-4ea8-b5c0-e85bb45a3b77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.632 187212 INFO nova.virt.libvirt.driver [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deleting instance files /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4_del#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.633 187212 INFO nova.virt.libvirt.driver [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deletion of /var/lib/nova/instances/b235a96f-7a12-4bd2-8627-33b128346aa4_del complete#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.650 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.651 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5a738836-adbb-4c42-85f4-148fbd612952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.655 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8d81d538-20ba-4060-a9c7-1b9fe7427fa8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.673 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3de205b6-bee7-493e-9ac9-21a7905c0588]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388441, 'reachable_time': 36104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230947, 'error': None, 'target': 'ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 systemd[1]: run-netns-ovnmeta\x2d02d8cc87\x2defdf\x2d4db2\x2db7ab\x2d393e2480966a.mount: Deactivated successfully.
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.677 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-02d8cc87-efdf-4db2-b7ab-393e2480966a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:09:36 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:36.678 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb4c171-481c-4b5f-ba64-983632495f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.728 187212 INFO nova.compute.manager [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG oslo.service.loopingcall [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:36 np0005546909 nova_compute[187208]: 2025-12-05 12:09:36.729 187212 DEBUG nova.network.neutron [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:37 np0005546909 nova_compute[187208]: 2025-12-05 12:09:37.877 187212 DEBUG nova.network.neutron [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:37 np0005546909 nova_compute[187208]: 2025-12-05 12:09:37.902 187212 INFO nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Took 1.17 seconds to deallocate network for instance.#033[00m
Dec  5 07:09:37 np0005546909 nova_compute[187208]: 2025-12-05 12:09:37.943 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:37 np0005546909 nova_compute[187208]: 2025-12-05 12:09:37.943 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.094 187212 DEBUG nova.compute.provider_tree [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.112 187212 DEBUG nova.scheduler.client.report [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.131 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.165 187212 INFO nova.scheduler.client.report [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Deleted allocations for instance b235a96f-7a12-4bd2-8627-33b128346aa4#033[00m
Dec  5 07:09:38 np0005546909 nova_compute[187208]: 2025-12-05 12:09:38.243 187212 DEBUG oslo_concurrency.lockutils [None req-9845714e-5261-4d2c-b3c8-b27c750a9eec 132d581de02e49b9a4c99b9b831dd5b5 bf30ed1956544c7eae67c989042126e4 - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:40 np0005546909 podman[230948]: 2025-12-05 12:09:40.219607443 +0000 UTC m=+0.070639813 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.509 187212 DEBUG nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.510 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG oslo_concurrency.lockutils [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 DEBUG nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.511 187212 WARNING nova.compute.manager [req-2600716f-f3bd-4b0f-b0cd-0ef40a2bca76 req-8c470da2-6c41-4539-96e9-5bce7b96f2c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-unplugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.715 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.716 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.717 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.718 187212 INFO nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Terminating instance#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.719 187212 DEBUG nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:40 np0005546909 kernel: tap11c7fa90-6a (unregistering): left promiscuous mode
Dec  5 07:09:40 np0005546909 NetworkManager[55691]: <info>  [1764936580.7484] device (tap11c7fa90-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:40Z|00684|binding|INFO|Releasing lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 from this chassis (sb_readonly=0)
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:40Z|00685|binding|INFO|Setting lport 11c7fa90-6a48-487a-a375-5adf7f41cb90 down in Southbound
Dec  5 07:09:40 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:40Z|00686|binding|INFO|Removing iface tap11c7fa90-6a ovn-installed in OVS
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.766 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:ee:e4 10.100.0.2'], port_security=['fa:16:3e:e4:ee:e4 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '2e537618-f998-4c4d-8e1e-e9cc79219330', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-034629ef-6cd1-463c-b963-3d0d9c530038', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e846fccb774e44f585d8847897bc4229', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a77f7593-d6d1-44fb-8125-66cdfc38709c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a4c9d894-0fc3-4aad-a4d5-6bee101a530c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=11c7fa90-6a48-487a-a375-5adf7f41cb90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.767 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 11c7fa90-6a48-487a-a375-5adf7f41cb90 in datapath 034629ef-6cd1-463c-b963-3d0d9c530038 unbound from our chassis#033[00m
Dec  5 07:09:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.769 104471 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 034629ef-6cd1-463c-b963-3d0d9c530038 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:40 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:40.770 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b8897bcf-300c-414f-bb59-baacb5cc9fdb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:40 np0005546909 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000046.scope: Deactivated successfully.
Dec  5 07:09:40 np0005546909 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d00000046.scope: Consumed 14.338s CPU time.
Dec  5 07:09:40 np0005546909 systemd-machined[153543]: Machine qemu-79-instance-00000046 terminated.
Dec  5 07:09:40 np0005546909 nova_compute[187208]: 2025-12-05 12:09:40.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.008 187212 INFO nova.virt.libvirt.driver [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Instance destroyed successfully.#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.009 187212 DEBUG nova.objects.instance [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lazy-loading 'resources' on Instance uuid 2e537618-f998-4c4d-8e1e-e9cc79219330 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.030 187212 DEBUG nova.virt.libvirt.vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-1436335913',display_name='tempest-ServerRescueTestJSONUnderV235-server-1436335913',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-1436335913',id=70,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:08:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e846fccb774e44f585d8847897bc4229',ramdisk_id='',reservation_id='r-230fx5t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-1035500959',owner_user_name='tempest-ServerRescueTestJSONUnderV235-1035500959-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:08:47Z,user_data=None,user_id='6a2cefdbcaae4db3b3ece95c8227d77e',uuid=2e537618-f998-4c4d-8e1e-e9cc79219330,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.031 187212 DEBUG nova.network.os_vif_util [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converting VIF {"id": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "address": "fa:16:3e:e4:ee:e4", "network": {"id": "034629ef-6cd1-463c-b963-3d0d9c530038", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1567734014-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e846fccb774e44f585d8847897bc4229", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c7fa90-6a", "ovs_interfaceid": "11c7fa90-6a48-487a-a375-5adf7f41cb90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.032 187212 DEBUG nova.network.os_vif_util [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.032 187212 DEBUG os_vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.035 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11c7fa90-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.038 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.040 187212 INFO os_vif [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:ee:e4,bridge_name='br-int',has_traffic_filtering=True,id=11c7fa90-6a48-487a-a375-5adf7f41cb90,network=Network(034629ef-6cd1-463c-b963-3d0d9c530038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c7fa90-6a')#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.041 187212 INFO nova.virt.libvirt.driver [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deleting instance files /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330_del#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.042 187212 INFO nova.virt.libvirt.driver [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deletion of /var/lib/nova/instances/2e537618-f998-4c4d-8e1e-e9cc79219330_del complete#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.049 187212 DEBUG nova.compute.manager [req-b15de738-2f3c-4d1c-bddf-7553d253060b req-be1941e0-b23c-45c2-a687-b66bda90f152 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-deleted-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.101 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.111 187212 INFO nova.compute.manager [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.111 187212 DEBUG oslo.service.loopingcall [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.112 187212 DEBUG nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.112 187212 DEBUG nova.network.neutron [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.335 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.336 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.513 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.513 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.536 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.678 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.678 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.724 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.725 187212 INFO nova.compute.claims [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.921 187212 DEBUG nova.compute.provider_tree [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.941 187212 DEBUG nova.scheduler.client.report [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.985 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:41 np0005546909 nova_compute[187208]: 2025-12-05 12:09:41.986 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.058 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.058 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.082 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.099 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:09:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:42Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ab:5e:ef 10.100.0.5
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.201 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.202 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.203 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating image(s)#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.204 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.204 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.205 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.231 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.292 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.293 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.294 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.306 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.363 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.364 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.619 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824" returned: 0 in 0.255s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.621 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.621 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.680 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.681 187212 DEBUG nova.virt.disk.api [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.681 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.703 187212 DEBUG nova.policy [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef254bb2df0442c6bcadfb3a6861c0e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e836357870d746e49bc783da7cd3accd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.739 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.740 187212 DEBUG nova.virt.disk.api [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.741 187212 DEBUG nova.objects.instance [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.758 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.759 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ensure instance console log exists: /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.759 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.760 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:42 np0005546909 nova_compute[187208]: 2025-12-05 12:09:42.760 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.036 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.363 187212 DEBUG nova.network.neutron [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.384 187212 INFO nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Took 2.27 seconds to deallocate network for instance.#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.427 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.428 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.566 187212 DEBUG nova.compute.provider_tree [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.579 187212 DEBUG nova.scheduler.client.report [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.601 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.633 187212 INFO nova.scheduler.client.report [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Deleted allocations for instance 2e537618-f998-4c4d-8e1e-e9cc79219330#033[00m
Dec  5 07:09:43 np0005546909 nova_compute[187208]: 2025-12-05 12:09:43.736 187212 DEBUG oslo_concurrency.lockutils [None req-6494100f-bd4a-4d41-a035-c41296c8643e 6a2cefdbcaae4db3b3ece95c8227d77e e846fccb774e44f585d8847897bc4229 - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.570 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 DEBUG oslo_concurrency.lockutils [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b235a96f-7a12-4bd2-8627-33b128346aa4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 DEBUG nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] No waiting events found dispatching network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.571 187212 WARNING nova.compute.manager [req-bd15d75f-20a0-4a37-a3ed-ab45d727f7f6 req-d9736480-dc64-4398-b292-0ff4dfd44761 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Received unexpected event network-vif-plugged-df4eecd2-b2e2-445a-acac-232f66123555 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.633 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [{"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.675 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-24358eea-14fb-4863-a6c4-aadcdb495f54" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.675 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.676 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.677 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.678 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 WARNING nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-unplugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.679 187212 DEBUG oslo_concurrency.lockutils [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2e537618-f998-4c4d-8e1e-e9cc79219330-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 DEBUG nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] No waiting events found dispatching network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 WARNING nova.compute.manager [req-84a36c4e-a5ee-4416-8376-c62aa703c274 req-88eca027-adaa-4f09-b246-2cfa1073ce1f 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received unexpected event network-vif-plugged-11c7fa90-6a48-487a-a375-5adf7f41cb90 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.680 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.681 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.766 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Successfully created port: 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.874 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.875 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.876 187212 INFO nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Terminating instance#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.877 187212 DEBUG nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:44 np0005546909 kernel: tap1b4ab157-dd (unregistering): left promiscuous mode
Dec  5 07:09:44 np0005546909 NetworkManager[55691]: <info>  [1764936584.9106] device (tap1b4ab157-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:44Z|00687|binding|INFO|Releasing lport 1b4ab157-ddea-449c-ab91-983a53dd2045 from this chassis (sb_readonly=0)
Dec  5 07:09:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:44Z|00688|binding|INFO|Setting lport 1b4ab157-ddea-449c-ab91-983a53dd2045 down in Southbound
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:44Z|00689|binding|INFO|Removing iface tap1b4ab157-dd ovn-installed in OVS
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.926 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:e5:0a 10.100.0.13'], port_security=['fa:16:3e:03:e5:0a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '854e3893-3908-4b4a-b29c-7fb4384e4f0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'df1c03c3-b3c9-47b6-a712-a13948dd510e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=1b4ab157-ddea-449c-ab91-983a53dd2045) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.927 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 1b4ab157-ddea-449c-ab91-983a53dd2045 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis#033[00m
Dec  5 07:09:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.930 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5c17e5c-2b6c-48d3-9992-ac34070e3363#033[00m
Dec  5 07:09:44 np0005546909 nova_compute[187208]: 2025-12-05 12:09:44.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3530f19a-efbc-42e1-847b-b383f880f06b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:44 np0005546909 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000041.scope: Deactivated successfully.
Dec  5 07:09:44 np0005546909 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000041.scope: Consumed 18.742s CPU time.
Dec  5 07:09:44 np0005546909 systemd-machined[153543]: Machine qemu-70-instance-00000041 terminated.
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:44.999 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[192eb20c-d4f2-4cee-aa17-457f3e71fe9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.002 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[60a4dc31-398e-4ff7-8fd2-e335c14bbfb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.033 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[89d62454-02bd-45d1-952c-c5ecb5474b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.049 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd8ba68-6d28-46bd-b39c-e60befb88627]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5c17e5c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:42:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 19, 'rx_bytes': 700, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 124], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371901, 'reachable_time': 40806, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231030, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.064 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f8bd43ca-7033-4fe0-9582-a62abfa03dc0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371911, 'tstamp': 371911}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231031, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb5c17e5c-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371914, 'tstamp': 371914}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231031, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.066 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.067 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.070 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5c17e5c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.071 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5c17e5c-20, col_values=(('external_ids', {'iface-id': 'bd03d3c4-09a9-42b5-bfad-4c02aa2d9ac5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:45.072 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.079 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.079 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.080 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.080 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.097 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.147 187212 INFO nova.virt.libvirt.driver [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Instance destroyed successfully.#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.148 187212 DEBUG nova.objects.instance [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 854e3893-3908-4b4a-b29c-7fb4384e4f0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.159 187212 DEBUG nova.virt.libvirt.vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:07:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-63085993',display_name='tempest-ServerActionsTestOtherB-server-63085993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-63085993',id=65,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:07:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-ruwsmmgi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:07:20Z,user_data=None,user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=854e3893-3908-4b4a-b29c-7fb4384e4f0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.160 187212 DEBUG nova.network.os_vif_util [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "1b4ab157-ddea-449c-ab91-983a53dd2045", "address": "fa:16:3e:03:e5:0a", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b4ab157-dd", "ovs_interfaceid": "1b4ab157-ddea-449c-ab91-983a53dd2045", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.161 187212 DEBUG nova.network.os_vif_util [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.161 187212 DEBUG os_vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.163 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b4ab157-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.169 187212 INFO os_vif [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:e5:0a,bridge_name='br-int',has_traffic_filtering=True,id=1b4ab157-ddea-449c-ab91-983a53dd2045,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b4ab157-dd')#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.170 187212 INFO nova.virt.libvirt.driver [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deleting instance files /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c_del#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.170 187212 INFO nova.virt.libvirt.driver [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deletion of /var/lib/nova/instances/854e3893-3908-4b4a-b29c-7fb4384e4f0c_del complete#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.404 187212 INFO nova.compute.manager [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 0.53 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.404 187212 DEBUG oslo.service.loopingcall [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.405 187212 DEBUG nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.405 187212 DEBUG nova.network.neutron [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.419 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.486 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.487 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.543 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.547 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Error from libvirt while getting description of instance-00000041: [Error Code 42] Domain not found: no domain with matching uuid '854e3893-3908-4b4a-b29c-7fb4384e4f0c' (instance-00000041): libvirt.libvirtError: Domain not found: no domain with matching uuid '854e3893-3908-4b4a-b29c-7fb4384e4f0c' (instance-00000041)#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.553 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.622 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.623 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.690 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.880 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.881 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5297MB free_disk=73.02593994140625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.882 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:45 np0005546909 nova_compute[187208]: 2025-12-05 12:09:45.882 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.068 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.069 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.195 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance dbbad270-1e3c-41e1-9173-c1b9df0ab2dd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 24358eea-14fb-4863-a6c4-aadcdb495f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.196 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 159b5354-c124-484f-a8ec-da1abf719114 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.479 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 54d9605a-998b-4492-afc8-f7a5b0dd4e84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1024MB phys_disk=79GB used_disk=4GB total_vcpus=8 used_vcpus=4 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.561 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.600 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:46 np0005546909 nova_compute[187208]: 2025-12-05 12:09:46.914 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.086 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.108 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.109 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.119 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.119 187212 INFO nova.compute.claims [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:09:47 np0005546909 nova_compute[187208]: 2025-12-05 12:09:47.646 187212 DEBUG nova.compute.provider_tree [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.003 187212 DEBUG nova.scheduler.client.report [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.027 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.028 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.037 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.082 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.082 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.157 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Received event network-vif-deleted-11c7fa90-6a48-487a-a375-5adf7f41cb90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.157 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG oslo_concurrency.lockutils [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.158 187212 DEBUG nova.compute.manager [req-baadf2c3-487d-4086-9260-a0bf65a6877f req-835e9bd8-8cb7-4748-8939-d8356aa633f8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-unplugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.183 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.183 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.185 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.185 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:09:48 np0005546909 podman[231062]: 2025-12-05 12:09:48.21015165 +0000 UTC m=+0.062522201 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.215 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.254 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.527 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.529 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.529 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating image(s)#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.530 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.530 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.531 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.548 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.613 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.614 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.615 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.626 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.688 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.689 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.728 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.729 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.729 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.789 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.791 187212 DEBUG nova.virt.disk.api [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.791 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.855 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.856 187212 DEBUG nova.virt.disk.api [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.857 187212 DEBUG nova.objects.instance [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.902 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Ensure instance console log exists: /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.903 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:48 np0005546909 nova_compute[187208]: 2025-12-05 12:09:48.904 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.300 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.301 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.301 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:09:50 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:50.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.305 187212 DEBUG nova.network.neutron [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.325 187212 INFO nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Took 4.92 seconds to deallocate network for instance.#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.376 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.376 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.397 187212 DEBUG nova.policy [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.524 187212 DEBUG nova.compute.provider_tree [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.538 187212 DEBUG nova.scheduler.client.report [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.567 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.590 187212 INFO nova.scheduler.client.report [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 854e3893-3908-4b4a-b29c-7fb4384e4f0c#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.653 187212 DEBUG oslo_concurrency.lockutils [None req-55935468-5b94-4e1b-9881-32e789b32b6b 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.910 187212 DEBUG oslo_concurrency.lockutils [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "854e3893-3908-4b4a-b29c-7fb4384e4f0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.911 187212 DEBUG nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] No waiting events found dispatching network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:50 np0005546909 nova_compute[187208]: 2025-12-05 12:09:50.911 187212 WARNING nova.compute.manager [req-92f9aed8-14e0-47f9-abb7-837f5c4d0798 req-f934d61b-3b59-4065-b09b-49d4a682c7d5 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received unexpected event network-vif-plugged-1b4ab157-ddea-449c-ab91-983a53dd2045 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.450 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.451 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.452 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.453 187212 INFO nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Terminating instance#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.455 187212 DEBUG nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:51 np0005546909 kernel: tap2e9efd6c-74 (unregistering): left promiscuous mode
Dec  5 07:09:51 np0005546909 NetworkManager[55691]: <info>  [1764936591.4755] device (tap2e9efd6c-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:51Z|00690|binding|INFO|Releasing lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 from this chassis (sb_readonly=0)
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.481 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:51Z|00691|binding|INFO|Setting lport 2e9efd6c-740c-405b-b9f0-bd46434070a7 down in Southbound
Dec  5 07:09:51 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:51Z|00692|binding|INFO|Removing iface tap2e9efd6c-74 ovn-installed in OVS
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.485 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.499 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.520 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936576.5187266, b235a96f-7a12-4bd2-8627-33b128346aa4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.520 187212 INFO nova.compute.manager [-] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:09:51 np0005546909 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000036.scope: Deactivated successfully.
Dec  5 07:09:51 np0005546909 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d00000036.scope: Consumed 13.234s CPU time.
Dec  5 07:09:51 np0005546909 systemd-machined[153543]: Machine qemu-81-instance-00000036 terminated.
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.543 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Successfully updated port: 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.664 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:5e:ef 10.100.0.5'], port_security=['fa:16:3e:ab:5e:ef 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '24358eea-14fb-4863-a6c4-aadcdb495f54', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58cbd93e463049988ccd6d013893e7d6', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'def128bf-31aa-408f-b463-573b7d555296', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.231', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d153ef6-62be-4b5b-8b0c-2bee0b9184c5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=2e9efd6c-740c-405b-b9f0-bd46434070a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.665 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 2e9efd6c-740c-405b-b9f0-bd46434070a7 in datapath b5c17e5c-2b6c-48d3-9992-ac34070e3363 unbound from our chassis#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.667 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5c17e5c-2b6c-48d3-9992-ac34070e3363, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.669 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f927fa9b-e9b5-45be-b894-cac3295b6d50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.669 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 namespace which is not needed anymore#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquired lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.680 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.683 187212 DEBUG nova.compute.manager [None req-2e64d549-3601-47d6-b551-e33f37549524 - - - - - -] [instance: b235a96f-7a12-4bd2-8627-33b128346aa4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.718 187212 INFO nova.virt.libvirt.driver [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Instance destroyed successfully.#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.719 187212 DEBUG nova.objects.instance [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lazy-loading 'resources' on Instance uuid 24358eea-14fb-4863-a6c4-aadcdb495f54 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.740 187212 DEBUG nova.virt.libvirt.vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1629320086',display_name='tempest-ServerActionsTestOtherB-server-1629320086',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1629320086',id=54,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCLIKVsEL0lmma4upWYe8NiCB7ZJxacCmu4vu1RJu3M5/Fu5S7w/HUSIKvvOTrl/9nUJ4pE5tXIAyPQxQDsptmV5i8IinhFeAgIm0GlEBvfbCuuhpWud8F+u8GsIwgaqpQ==',key_name='tempest-keypair-776546213',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58cbd93e463049988ccd6d013893e7d6',ramdisk_id='',reservation_id='r-l59qc6ty',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1759520420',owner_user_name='tempest-ServerActionsTestOtherB-1759520420-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:09:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4ad1281afc874c0ca55d908d3a6e05a8',uuid=24358eea-14fb-4863-a6c4-aadcdb495f54,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.741 187212 DEBUG nova.network.os_vif_util [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converting VIF {"id": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "address": "fa:16:3e:ab:5e:ef", "network": {"id": "b5c17e5c-2b6c-48d3-9992-ac34070e3363", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-11848074-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58cbd93e463049988ccd6d013893e7d6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2e9efd6c-74", "ovs_interfaceid": "2e9efd6c-740c-405b-b9f0-bd46434070a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.742 187212 DEBUG nova.network.os_vif_util [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.742 187212 DEBUG os_vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.745 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2e9efd6c-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.752 187212 INFO os_vif [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:5e:ef,bridge_name='br-int',has_traffic_filtering=True,id=2e9efd6c-740c-405b-b9f0-bd46434070a7,network=Network(b5c17e5c-2b6c-48d3-9992-ac34070e3363),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2e9efd6c-74')#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.752 187212 INFO nova.virt.libvirt.driver [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deleting instance files /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.757 187212 INFO nova.virt.libvirt.driver [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deletion of /var/lib/nova/instances/24358eea-14fb-4863-a6c4-aadcdb495f54_del complete#033[00m
Dec  5 07:09:51 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : haproxy version is 2.8.14-c23fe91
Dec  5 07:09:51 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [NOTICE]   (225423) : path to executable is /usr/sbin/haproxy
Dec  5 07:09:51 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [WARNING]  (225423) : Exiting Master process...
Dec  5 07:09:51 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [ALERT]    (225423) : Current worker (225425) exited with code 143 (Terminated)
Dec  5 07:09:51 np0005546909 neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363[225417]: [WARNING]  (225423) : All workers exited. Exiting... (0)
Dec  5 07:09:51 np0005546909 systemd[1]: libpod-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope: Deactivated successfully.
Dec  5 07:09:51 np0005546909 podman[231145]: 2025-12-05 12:09:51.796695637 +0000 UTC m=+0.040533592 container died 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.806 187212 INFO nova.compute.manager [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.806 187212 DEBUG oslo.service.loopingcall [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.807 187212 DEBUG nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.807 187212 DEBUG nova.network.neutron [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:51 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5-userdata-shm.mount: Deactivated successfully.
Dec  5 07:09:51 np0005546909 systemd[1]: var-lib-containers-storage-overlay-93f32fdd3ea0552c523abfc1a627c1ddf05c35a6f969e26671c37410720e74dc-merged.mount: Deactivated successfully.
Dec  5 07:09:51 np0005546909 podman[231145]: 2025-12-05 12:09:51.835872399 +0000 UTC m=+0.079710354 container cleanup 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.837 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:09:51 np0005546909 systemd[1]: libpod-conmon-65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5.scope: Deactivated successfully.
Dec  5 07:09:51 np0005546909 podman[231176]: 2025-12-05 12:09:51.898826971 +0000 UTC m=+0.041034386 container remove 65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.904 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[42f7b2b9-c2fe-4a52-9c6e-b56ad3e11bdd]: (4, ('Fri Dec  5 12:09:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 (65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5)\n65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5\nFri Dec  5 12:09:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 (65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5)\n65d511f8d6be07e1583f4d0f8b73c6d5c09f8fc1e10b24f2274f2319e8ad80e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.906 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0278b9-0aa1-4197-8e71-c74b08279188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.907 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5c17e5c-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:51 np0005546909 kernel: tapb5c17e5c-20: left promiscuous mode
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[06367650-203f-4e3d-be4c-7f621b3afd42]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 nova_compute[187208]: 2025-12-05 12:09:51.922 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8c68ae03-bc83-4718-a6b1-9d6b062fc701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.943 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f52005e-f6fb-448a-8b7b-da46d96afbcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.957 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[62cde3f9-1902-47f9-8397-058446386876]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371894, 'reachable_time': 25036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231189, 'error': None, 'target': 'ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.959 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5c17e5c-2b6c-48d3-9992-ac34070e3363 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:09:51 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:51.959 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[5e10bdae-c100-46b4-b674-c5aa56dcb910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:51 np0005546909 systemd[1]: run-netns-ovnmeta\x2db5c17e5c\x2d2b6c\x2d48d3\x2d9992\x2dac34070e3363.mount: Deactivated successfully.
Dec  5 07:09:52 np0005546909 nova_compute[187208]: 2025-12-05 12:09:52.743 187212 DEBUG nova.network.neutron [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.033 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Releasing lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.034 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance network_info: |[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.037 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start _get_guest_xml network_info=[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.246 187212 WARNING nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.253 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.254 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.259 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.260 187212 DEBUG nova.virt.libvirt.host [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.260 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.261 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.261 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.262 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.263 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.264 187212 DEBUG nova.virt.hardware [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.269 187212 DEBUG nova.virt.libvirt.vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerD
iskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:42Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.269 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.270 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.271 187212 DEBUG nova.objects.instance [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.340 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <uuid>159b5354-c124-484f-a8ec-da1abf719114</uuid>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <name>instance-00000049</name>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-2012489303</nova:name>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:09:53</nova:creationTime>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        <nova:port uuid="7370bdd5-ddf8-40de-9f35-975b8ceab3ef">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="serial">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="uuid">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ee:f0:e8"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <target dev="tap7370bdd5-dd"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log" append="off"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:09:53 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:09:53 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:09:53 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:09:53 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.342 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Preparing to wait for external event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.342 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.343 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.343 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.virt.libvirt.vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempe
st-ServerDiskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:42Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.344 187212 DEBUG nova.network.os_vif_util [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.345 187212 DEBUG os_vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.346 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.346 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.347 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Successfully created port: ef99bad5-d092-46f6-9b3a-8225cc233d1e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.352 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.353 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7370bdd5-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.354 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7370bdd5-dd, col_values=(('external_ids', {'iface-id': '7370bdd5-ddf8-40de-9f35-975b8ceab3ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f0:e8', 'vm-uuid': '159b5354-c124-484f-a8ec-da1abf719114'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:09:53 np0005546909 NetworkManager[55691]: <info>  [1764936593.3574] manager: (tap7370bdd5-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.365 187212 INFO os_vif [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.494 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:ee:f0:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.495 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Using config drive#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.770 187212 INFO nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating config drive at /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.777 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwyb6xu_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.905 187212 DEBUG oslo_concurrency.processutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwyb6xu_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:09:53 np0005546909 kernel: tap7370bdd5-dd: entered promiscuous mode
Dec  5 07:09:53 np0005546909 systemd-udevd[231106]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:09:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:53Z|00693|binding|INFO|Claiming lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef for this chassis.
Dec  5 07:09:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:53Z|00694|binding|INFO|7370bdd5-ddf8-40de-9f35-975b8ceab3ef: Claiming fa:16:3e:ee:f0:e8 10.100.0.14
Dec  5 07:09:53 np0005546909 NetworkManager[55691]: <info>  [1764936593.9762] manager: (tap7370bdd5-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.982 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.983 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis#033[00m
Dec  5 07:09:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.985 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c#033[00m
Dec  5 07:09:53 np0005546909 NetworkManager[55691]: <info>  [1764936593.9869] device (tap7370bdd5-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:09:53 np0005546909 NetworkManager[55691]: <info>  [1764936593.9875] device (tap7370bdd5-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:09:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:53Z|00695|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef ovn-installed in OVS
Dec  5 07:09:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:53Z|00696|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef up in Southbound
Dec  5 07:09:53 np0005546909 nova_compute[187208]: 2025-12-05 12:09:53.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8116b836-e86e-4cfa-b3d4-556d3f8fa9c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.997 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.999 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:53.999 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[959ca3cf-c740-44cc-9501-cfa2b2239ba6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e525708-5174-4bdf-ad88-1cd982ae1edb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.013 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c45ffd77-ae24-4ac7-b22c-b3ad89a505a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 systemd-machined[153543]: New machine qemu-82-instance-00000049.
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.025 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3ba3be-f90a-4c5a-a976-1b5fdcdbec45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 systemd[1]: Started Virtual Machine qemu-82-instance-00000049.
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.051 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b19a87-ca2f-4e14-888f-93e92bec132a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.058 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac234cd7-1ae5-40cb-b634-a8fa9e4fc52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 NetworkManager[55691]: <info>  [1764936594.0603] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/270)
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.089 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[508362f2-b86c-4a3e-826b-241ee5d99630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.092 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8983ea-116d-480d-b9ce-ced77d856e51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.093 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Received event network-vif-deleted-1b4ab157-ddea-449c-ab91-983a53dd2045 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.093 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-changed-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.094 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Refreshing instance network info cache due to event network-changed-7370bdd5-ddf8-40de-9f35-975b8ceab3ef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.095 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Refreshing network info cache for port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:09:54 np0005546909 NetworkManager[55691]: <info>  [1764936594.1166] device (tap7be4540a-00): carrier: link connected
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.121 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[86b3f2b7-664f-41bd-a858-994fdb71806e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.137 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2c91e24c-8d58-4a7f-a228-90d1a8d6acc4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397468, 'reachable_time': 33202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231240, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.152 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0dd987-04ae-4056-bfce-df4b8e818e16]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397468, 'tstamp': 397468}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231241, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.173 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[50326e5e-13b1-4290-b50a-be90c2dcb8c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397468, 'reachable_time': 33202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231242, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.206 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cfda746a-de5d-44a6-b684-099e29a0dfa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.266 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfe1d23-6864-4cb8-b7ea-e3d94111dad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.268 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.269 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:54 np0005546909 kernel: tap7be4540a-00: entered promiscuous mode
Dec  5 07:09:54 np0005546909 NetworkManager[55691]: <info>  [1764936594.2718] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.271 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.274 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:54Z|00697|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.277 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.277 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c50be710-d029-491b-b265-a902d8b48b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.278 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:09:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:54.281 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.513 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936594.5128646, 159b5354-c124-484f-a8ec-da1abf719114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.514 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Started (Lifecycle Event)#033[00m
Dec  5 07:09:54 np0005546909 podman[231281]: 2025-12-05 12:09:54.654213111 +0000 UTC m=+0.053938176 container create 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  5 07:09:54 np0005546909 systemd[1]: Started libpod-conmon-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope.
Dec  5 07:09:54 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:09:54 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db0958bae067600c1586deb306b205beef4d4a15d45a054b88ed994a15bf001d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:09:54 np0005546909 podman[231281]: 2025-12-05 12:09:54.623987265 +0000 UTC m=+0.023712360 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:09:54 np0005546909 podman[231281]: 2025-12-05 12:09:54.738124923 +0000 UTC m=+0.137850018 container init 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:09:54 np0005546909 podman[231281]: 2025-12-05 12:09:54.744218357 +0000 UTC m=+0.143943422 container start 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:09:54 np0005546909 podman[231294]: 2025-12-05 12:09:54.758746203 +0000 UTC m=+0.064610040 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:09:54 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : New worker (231320) forked
Dec  5 07:09:54 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : Loading success.
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.808 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.815 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936594.513273, 159b5354-c124-484f-a8ec-da1abf719114 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.815 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.838 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.841 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:54 np0005546909 nova_compute[187208]: 2025-12-05 12:09:54.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.204 187212 DEBUG nova.compute.manager [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.205 187212 DEBUG oslo_concurrency.lockutils [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.206 187212 DEBUG nova.compute.manager [req-c5526603-da02-408b-b9f1-b299d0c9a11d req-d134e885-51cd-4914-adcf-e4f1804c4804 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Processing event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.206 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.210 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936595.209823, 159b5354-c124-484f-a8ec-da1abf719114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.210 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.212 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.216 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance spawned successfully.#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.216 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.248 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.256 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.260 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.261 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.261 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.262 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.262 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.263 187212 DEBUG nova.virt.libvirt.driver [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.295 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.336 187212 INFO nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 13.13 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.337 187212 DEBUG nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.413 187212 INFO nova.compute.manager [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 13.84 seconds to build instance.#033[00m
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.437 187212 DEBUG oslo_concurrency.lockutils [None req-f8e3e499-44d9-4f3c-8544-959fb37bed53 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:55Z|00698|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:09:55 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:55Z|00699|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec  5 07:09:55 np0005546909 nova_compute[187208]: 2025-12-05 12:09:55.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.006 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936581.005481, 2e537618-f998-4c4d-8e1e-e9cc79219330 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.007 187212 INFO nova.compute.manager [-] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.030 187212 DEBUG nova.network.neutron [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.032 187212 DEBUG nova.compute.manager [None req-d8000d96-c9e5-4bb5-a562-40943c9b246c - - - - - -] [instance: 2e537618-f998-4c4d-8e1e-e9cc79219330] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.060 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Took 4.25 seconds to deallocate network for instance.#033[00m
Dec  5 07:09:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:56Z|00700|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:09:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:56Z|00701|binding|INFO|Releasing lport da9adcd8-f2a5-4ff7-962a-717d700ad7b5 from this chassis (sb_readonly=0)
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.149 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.150 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.241 187212 DEBUG nova.compute.provider_tree [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.262 187212 DEBUG nova.scheduler.client.report [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.322 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.354 187212 INFO nova.scheduler.client.report [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Deleted allocations for instance 24358eea-14fb-4863-a6c4-aadcdb495f54#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.527 187212 DEBUG oslo_concurrency.lockutils [None req-e6dc95a5-08e7-4aa8-b4d5-ca2f1a372cfe 4ad1281afc874c0ca55d908d3a6e05a8 58cbd93e463049988ccd6d013893e7d6 - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.794 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.795 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.795 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 DEBUG oslo_concurrency.lockutils [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.796 187212 WARNING nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received unexpected event network-vif-plugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:09:56 np0005546909 nova_compute[187208]: 2025-12-05 12:09:56.797 187212 DEBUG nova.compute.manager [req-fb4bf697-4411-4adb-9e0d-b3e637d3eaf0 req-78ae3f61-feaa-42b9-86e4-02be0f3e6421 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-deleted-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.885 187212 DEBUG nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.886 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG oslo_concurrency.lockutils [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.887 187212 DEBUG nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:57 np0005546909 nova_compute[187208]: 2025-12-05 12:09:57.888 187212 WARNING nova.compute.manager [req-523f4e09-6a08-43ee-b65c-9323659bc3a7 req-cf5dcf14-2c0f-4189-8fa9-0c421b28f6d8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.023 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Successfully updated port: ef99bad5-d092-46f6-9b3a-8225cc233d1e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.033 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updated VIF entry in instance network info cache for port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.034 187212 DEBUG nova.network.neutron [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.092 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.093 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.093 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-159b5354-c124-484f-a8ec-da1abf719114" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.094 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG oslo_concurrency.lockutils [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "24358eea-14fb-4863-a6c4-aadcdb495f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] No waiting events found dispatching network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.095 187212 DEBUG nova.compute.manager [req-f8457037-65f6-40fb-8b19-74e5ca1823b7 req-c6624435-12a2-442a-a258-3b464464b0df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Received event network-vif-unplugged-2e9efd6c-740c-405b-b9f0-bd46434070a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.096 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.097 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.097 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.098 187212 INFO nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Terminating instance#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.099 187212 DEBUG nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.100 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 kernel: tapcf99cdda-70 (unregistering): left promiscuous mode
Dec  5 07:09:58 np0005546909 NetworkManager[55691]: <info>  [1764936598.1333] device (tapcf99cdda-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:09:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:58Z|00702|binding|INFO|Releasing lport cf99cdda-7071-4c18-8462-3a556234d81d from this chassis (sb_readonly=0)
Dec  5 07:09:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:58Z|00703|binding|INFO|Setting lport cf99cdda-7071-4c18-8462-3a556234d81d down in Southbound
Dec  5 07:09:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:09:58Z|00704|binding|INFO|Removing iface tapcf99cdda-70 ovn-installed in OVS
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.142 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.145 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.152 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:68:ad 10.100.0.4'], port_security=['fa:16:3e:60:68:ad 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'dbbad270-1e3c-41e1-9173-c1b9df0ab2dd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1bdbd9c8684c4b9b97e00725e41037eb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d91f504-323f-40f6-96ee-8e841aa785bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd596033-693a-40ca-949c-841d866181bd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=cf99cdda-7071-4c18-8462-3a556234d81d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.154 104471 INFO neutron.agent.ovn.metadata.agent [-] Port cf99cdda-7071-4c18-8462-3a556234d81d in datapath d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 unbound from our chassis#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.156 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.157 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[74a3c9bf-d58a-4905-988d-a2d9c80e178f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.158 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 namespace which is not needed anymore#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000048.scope: Deactivated successfully.
Dec  5 07:09:58 np0005546909 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d00000048.scope: Consumed 13.691s CPU time.
Dec  5 07:09:58 np0005546909 systemd-machined[153543]: Machine qemu-80-instance-00000048 terminated.
Dec  5 07:09:58 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : haproxy version is 2.8.14-c23fe91
Dec  5 07:09:58 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [NOTICE]   (230487) : path to executable is /usr/sbin/haproxy
Dec  5 07:09:58 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [WARNING]  (230487) : Exiting Master process...
Dec  5 07:09:58 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [ALERT]    (230487) : Current worker (230489) exited with code 143 (Terminated)
Dec  5 07:09:58 np0005546909 neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22[230483]: [WARNING]  (230487) : All workers exited. Exiting... (0)
Dec  5 07:09:58 np0005546909 systemd[1]: libpod-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope: Deactivated successfully.
Dec  5 07:09:58 np0005546909 podman[231354]: 2025-12-05 12:09:58.30059132 +0000 UTC m=+0.047091409 container died 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:09:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85-userdata-shm.mount: Deactivated successfully.
Dec  5 07:09:58 np0005546909 systemd[1]: var-lib-containers-storage-overlay-58192d43ddea201cbf99dd4a079d9f14871a3e9699282f5a030bf56f666b8ee1-merged.mount: Deactivated successfully.
Dec  5 07:09:58 np0005546909 podman[231354]: 2025-12-05 12:09:58.348033148 +0000 UTC m=+0.094533247 container cleanup 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 systemd[1]: libpod-conmon-34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85.scope: Deactivated successfully.
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.376 187212 INFO nova.virt.libvirt.driver [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Instance destroyed successfully.#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.376 187212 DEBUG nova.objects.instance [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lazy-loading 'resources' on Instance uuid dbbad270-1e3c-41e1-9173-c1b9df0ab2dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.382 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.396 187212 DEBUG nova.virt.libvirt.vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-1390207148',display_name='tempest-ServerMetadataTestJSON-server-1390207148',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-1390207148',id=72,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1bdbd9c8684c4b9b97e00725e41037eb',ramdisk_id='',reservation_id='r-8eo91pmy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-355236921',owner_user_name='tempest-ServerMetadataTestJSON-355236921-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:09:56Z,user_data=None,user_id='4f805540d6084f53aa7bd5a66912be58',uuid=dbbad270-1e3c-41e1-9173-c1b9df0ab2dd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.397 187212 DEBUG nova.network.os_vif_util [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converting VIF {"id": "cf99cdda-7071-4c18-8462-3a556234d81d", "address": "fa:16:3e:60:68:ad", "network": {"id": "d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1005161499-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1bdbd9c8684c4b9b97e00725e41037eb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcf99cdda-70", "ovs_interfaceid": "cf99cdda-7071-4c18-8462-3a556234d81d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.397 187212 DEBUG nova.network.os_vif_util [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.398 187212 DEBUG os_vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.399 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.400 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf99cdda-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.403 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.406 187212 INFO os_vif [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:60:68:ad,bridge_name='br-int',has_traffic_filtering=True,id=cf99cdda-7071-4c18-8462-3a556234d81d,network=Network(d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcf99cdda-70')#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.407 187212 INFO nova.virt.libvirt.driver [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deleting instance files /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd_del#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.407 187212 INFO nova.virt.libvirt.driver [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deletion of /var/lib/nova/instances/dbbad270-1e3c-41e1-9173-c1b9df0ab2dd_del complete#033[00m
Dec  5 07:09:58 np0005546909 podman[231397]: 2025-12-05 12:09:58.429413618 +0000 UTC m=+0.055497640 container remove 34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.437 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[55a9a081-d4de-49c1-a9fb-bda1696a5bcd]: (4, ('Fri Dec  5 12:09:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 (34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85)\n34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85\nFri Dec  5 12:09:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 (34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85)\n34e59dbb8912cf418fcb640ffb2d59c4b1c46834d9f014bfc19e8a72a3ee6d85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.440 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8080d015-ebf8-48cc-9b03-6202daec3de9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.442 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5794fbb-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.444 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 kernel: tapd5794fbb-c0: left promiscuous mode
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.450 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[721e1f85-e0ea-47ad-95af-f51ebd18c8a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.465 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.472 187212 INFO nova.compute.manager [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG oslo.service.loopingcall [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.473 187212 DEBUG nova.network.neutron [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6e7fe62f-7027-4722-8d48-4b72fcaae141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.482 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5d20a757-6e25-43a0-9527-14f2d6c38fa7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.499 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[38b653d2-fcd8-4b19-8c3d-f99028b09280]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 393233, 'reachable_time': 18313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231411, 'error': None, 'target': 'ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.502 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d5794fbb-c1d5-48a4-95d7-a9b4ae1fcb22 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:09:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:09:58.503 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6f8025-f346-4518-ad35-d170cb5a4e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:09:58 np0005546909 systemd[1]: run-netns-ovnmeta\x2dd5794fbb\x2dc1d5\x2d48a4\x2d95d7\x2da9b4ae1fcb22.mount: Deactivated successfully.
Dec  5 07:09:58 np0005546909 nova_compute[187208]: 2025-12-05 12:09:58.899 187212 INFO nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Rebuilding instance#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.167 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'trusted_certs' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.434 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.508 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_requests' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.522 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.539 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.551 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.566 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:09:59 np0005546909 nova_compute[187208]: 2025-12-05 12:09:59.570 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.144 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936585.1436546, 854e3893-3908-4b4a-b29c-7fb4384e4f0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.146 187212 INFO nova.compute.manager [-] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.178 187212 DEBUG nova.compute.manager [None req-019c1553-3047-4daf-b809-c66895b903e1 - - - - - -] [instance: 854e3893-3908-4b4a-b29c-7fb4384e4f0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.193 187212 DEBUG nova.compute.manager [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.194 187212 DEBUG nova.compute.manager [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.194 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.657 187212 DEBUG nova.network.neutron [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.683 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.683 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance network_info: |[{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.684 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.684 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.688 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Start _get_guest_xml network_info=[{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.694 187212 WARNING nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.705 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.706 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.715 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.libvirt.host [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.717 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.718 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.718 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.719 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.720 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.720 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.721 187212 DEBUG nova.virt.hardware [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.726 187212 DEBUG nova.virt.libvirt.vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.726 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.727 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.729 187212 DEBUG nova.objects.instance [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid 54d9605a-998b-4492-afc8-f7a5b0dd4e84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.746 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <uuid>54d9605a-998b-4492-afc8-f7a5b0dd4e84</uuid>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <name>instance-0000004a</name>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-569275018</nova:name>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:10:00</nova:creationTime>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        <nova:port uuid="ef99bad5-d092-46f6-9b3a-8225cc233d1e">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="serial">54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="uuid">54d9605a-998b-4492-afc8-f7a5b0dd4e84</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:bd:e5:94"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <target dev="tapef99bad5-d0"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/console.log" append="off"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:10:00 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:10:00 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:10:00 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:10:00 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.752 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Preparing to wait for external event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.753 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.754 187212 DEBUG nova.virt.libvirt.vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:09:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-569275018',display_name='tempest-tempest.common.compute-instance-569275018',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-569275018',id=74,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-h1nkz7rb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=54d9605a-998b-4492-afc8-f7a5b0dd4e84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.755 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.755 187212 DEBUG nova.network.os_vif_util [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.756 187212 DEBUG os_vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.757 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.757 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.763 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef99bad5-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.764 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapef99bad5-d0, col_values=(('external_ids', {'iface-id': 'ef99bad5-d092-46f6-9b3a-8225cc233d1e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:e5:94', 'vm-uuid': '54d9605a-998b-4492-afc8-f7a5b0dd4e84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:10:00 np0005546909 NetworkManager[55691]: <info>  [1764936600.7671] manager: (tapef99bad5-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/272)
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.772 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.773 187212 INFO os_vif [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:e5:94,bridge_name='br-int',has_traffic_filtering=True,id=ef99bad5-d092-46f6-9b3a-8225cc233d1e,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapef99bad5-d0')#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.777 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-unplugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.778 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG oslo_concurrency.lockutils [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.779 187212 DEBUG nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] No waiting events found dispatching network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.780 187212 WARNING nova.compute.manager [req-892da245-0b1e-48ff-b081-15279048468b req-a87873fb-fdce-4bfe-a0cb-becde8ff48dc 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received unexpected event network-vif-plugged-cf99cdda-7071-4c18-8462-3a556234d81d for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.781 187212 DEBUG nova.network.neutron [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.800 187212 INFO nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Took 2.33 seconds to deallocate network for instance.#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.852 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:bd:e5:94, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.853 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Using config drive#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.856 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.856 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.948 187212 DEBUG nova.compute.provider_tree [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:00 np0005546909 nova_compute[187208]: 2025-12-05 12:10:00.963 187212 DEBUG nova.scheduler.client.report [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:01 np0005546909 nova_compute[187208]: 2025-12-05 12:10:01.024 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:01 np0005546909 nova_compute[187208]: 2025-12-05 12:10:01.057 187212 INFO nova.scheduler.client.report [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Deleted allocations for instance dbbad270-1e3c-41e1-9173-c1b9df0ab2dd#033[00m
Dec  5 07:10:01 np0005546909 nova_compute[187208]: 2025-12-05 12:10:01.147 187212 DEBUG oslo_concurrency.lockutils [None req-652ce5c1-130e-4597-8193-a663ccf337eb 4f805540d6084f53aa7bd5a66912be58 1bdbd9c8684c4b9b97e00725e41037eb - - default default] Lock "dbbad270-1e3c-41e1-9173-c1b9df0ab2dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.047 187212 INFO nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Creating config drive at /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.053 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpolobh4qj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.183 187212 DEBUG oslo_concurrency.processutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpolobh4qj" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:02 np0005546909 kernel: tapef99bad5-d0: entered promiscuous mode
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.2676] manager: (tapef99bad5-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/273)
Dec  5 07:10:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:02Z|00705|binding|INFO|Claiming lport ef99bad5-d092-46f6-9b3a-8225cc233d1e for this chassis.
Dec  5 07:10:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:02Z|00706|binding|INFO|ef99bad5-d092-46f6-9b3a-8225cc233d1e: Claiming fa:16:3e:bd:e5:94 10.100.0.6
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.269 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.328 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:02Z|00707|binding|INFO|Setting lport ef99bad5-d092-46f6-9b3a-8225cc233d1e ovn-installed in OVS
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.334 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 systemd-udevd[231456]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:10:02 np0005546909 systemd-machined[153543]: New machine qemu-83-instance-0000004a.
Dec  5 07:10:02 np0005546909 podman[231425]: 2025-12-05 12:10:02.348091084 +0000 UTC m=+0.084528191 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:10:02 np0005546909 systemd[1]: Started Virtual Machine qemu-83-instance-0000004a.
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.3530] device (tapef99bad5-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.3540] device (tapef99bad5-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:10:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:02Z|00708|binding|INFO|Setting lport ef99bad5-d092-46f6-9b3a-8225cc233d1e up in Southbound
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.397 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:e5:94 10.100.0.6'], port_security=['fa:16:3e:bd:e5:94 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ef99bad5-d092-46f6-9b3a-8225cc233d1e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.399 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ef99bad5-d092-46f6-9b3a-8225cc233d1e in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:10:02 np0005546909 podman[231439]: 2025-12-05 12:10:02.401140163 +0000 UTC m=+0.086243030 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.401 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.415 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cab6b481-f77e-4cc5-b7c9-9dec533e66aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.416 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbfed6fc-31 in ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.419 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbfed6fc-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.419 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[277dc790-3142-4369-9fbf-f0fe5fd40b2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.421 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[04d8195f-6519-4ef6-afd8-51e052390b22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.441 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[f995f981-a665-42e5-9f00-4e51572ab1a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6259a408-5f98-44bd-b7d5-e8fded20e679]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.498 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5482da6c-3a4c-4be2-8550-f57d66f6318e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.504 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7fca83-1a73-4c88-b8fe-67388e8a3d3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.5056] manager: (tapfbfed6fc-30): new Veth device (/org/freedesktop/NetworkManager/Devices/274)
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.539 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed7bde6-3b8e-4a7d-8611-cb148ab6e2a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.544 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[29bf5df2-b8ee-49b6-870c-0c6129ae2b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.5695] device (tapfbfed6fc-30): carrier: link connected
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.579 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[274bb502-10ee-4f3c-aa22-741f8176763c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.600 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9e54ac04-4f9b-4502-88d3-816c49f09155]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231502, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.616 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[72505309-d959-4fbc-9cc4-e57ed9d560d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe26:8872'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398313, 'tstamp': 398313}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231503, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.620 187212 DEBUG nova.compute.manager [req-f721178f-5b07-44d5-8279-66ccdf46626b req-d59bc667-927d-4f54-ab01-7a9457e89e46 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Received event network-vif-deleted-cf99cdda-7071-4c18-8462-3a556234d81d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.634 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b774b591-fba1-45d0-b8f5-90d6f51dcd14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231504, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.673 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0b202cb9-bfe3-4dd3-a95b-d65bf5dcd84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.744 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8858b129-f3ec-4d0f-8e41-788a5c95659d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.748 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.749 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.750 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 NetworkManager[55691]: <info>  [1764936602.7536] manager: (tapfbfed6fc-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Dec  5 07:10:02 np0005546909 kernel: tapfbfed6fc-30: entered promiscuous mode
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.757 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.759 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:02Z|00709|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.777 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.779 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a2131f08-9763-49f1-b735-76d4f9f22b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.780 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.pid.haproxy
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID fbfed6fc-3701-4311-a4c2-8c49c5b7584c
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:10:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:02.781 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'env', 'PROCESS_TAG=haproxy-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbfed6fc-3701-4311-a4c2-8c49c5b7584c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.823 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936602.8233302, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.824 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Started (Lifecycle Event)#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.850 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.861 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936602.8235443, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.862 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.883 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.888 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:10:02 np0005546909 nova_compute[187208]: 2025-12-05 12:10:02.914 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:10:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.016 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.017 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:03.018 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:03 np0005546909 nova_compute[187208]: 2025-12-05 12:10:03.092 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:03 np0005546909 podman[231543]: 2025-12-05 12:10:03.184249085 +0000 UTC m=+0.049928621 container create 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 07:10:03 np0005546909 systemd[1]: Started libpod-conmon-2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0.scope.
Dec  5 07:10:03 np0005546909 podman[231543]: 2025-12-05 12:10:03.159572268 +0000 UTC m=+0.025251824 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:10:03 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:10:03 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0abe6a70244aa25519de7b565800e0d75c41ac2b2899ca45664096e7b4998d70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:10:03 np0005546909 podman[231543]: 2025-12-05 12:10:03.278350939 +0000 UTC m=+0.144030495 container init 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:10:03 np0005546909 podman[231543]: 2025-12-05 12:10:03.284905356 +0000 UTC m=+0.150584892 container start 2a76df6f8bc7fc7960fb8fef1df1b8162a596895270fe62ed74babf20c79a1f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:10:03 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : New worker (231565) forked
Dec  5 07:10:03 np0005546909 neutron-haproxy-ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c[231559]: [NOTICE]   (231563) : Loading success.
Dec  5 07:10:03 np0005546909 nova_compute[187208]: 2025-12-05 12:10:03.412 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:10:03 np0005546909 nova_compute[187208]: 2025-12-05 12:10:03.413 187212 DEBUG nova.network.neutron [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:03 np0005546909 nova_compute[187208]: 2025-12-05 12:10:03.453 187212 DEBUG oslo_concurrency.lockutils [req-f8d7c39a-3b98-451d-870b-d4e0e8ed64d5 req-64ab63a9-b403-4451-b63e-7f21ee9ece39 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.913 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.913 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Processing event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.914 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG oslo_concurrency.lockutils [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] No waiting events found dispatching network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 WARNING nova.compute.manager [req-ebf98251-d6e6-4f88-818a-2643f45010f3 req-6c9dc2d9-3b87-4bb9-b325-5d53988d6064 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received unexpected event network-vif-plugged-ef99bad5-d092-46f6-9b3a-8225cc233d1e for instance with vm_state building and task_state spawning.#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.915 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.919 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936604.9192653, 54d9605a-998b-4492-afc8-f7a5b0dd4e84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.919 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.922 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.927 187212 INFO nova.virt.libvirt.driver [-] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Instance spawned successfully.#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.928 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.964 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.971 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.972 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.972 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.973 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.973 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.974 187212 DEBUG nova.virt.libvirt.driver [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:04 np0005546909 nova_compute[187208]: 2025-12-05 12:10:04.982 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.262 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.283 187212 INFO nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 16.76 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.284 187212 DEBUG nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.366 187212 INFO nova.compute.manager [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Took 18.28 seconds to build instance.#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.387 187212 DEBUG oslo_concurrency.lockutils [None req-3c881e6c-cdf0-43f6-8f08-4627bc5cabed 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "54d9605a-998b-4492-afc8-f7a5b0dd4e84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:05 np0005546909 nova_compute[187208]: 2025-12-05 12:10:05.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:06 np0005546909 nova_compute[187208]: 2025-12-05 12:10:06.716 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936591.7157092, 24358eea-14fb-4863-a6c4-aadcdb495f54 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:06 np0005546909 nova_compute[187208]: 2025-12-05 12:10:06.717 187212 INFO nova.compute.manager [-] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:10:07 np0005546909 nova_compute[187208]: 2025-12-05 12:10:07.013 187212 DEBUG nova.compute.manager [None req-b6d856f4-2460-466b-bf95-245f12e859c0 - - - - - -] [instance: 24358eea-14fb-4863-a6c4-aadcdb495f54] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:07 np0005546909 podman[231589]: 2025-12-05 12:10:07.213221198 +0000 UTC m=+0.058029553 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:10:07 np0005546909 podman[231590]: 2025-12-05 12:10:07.245819541 +0000 UTC m=+0.090688987 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 07:10:08 np0005546909 nova_compute[187208]: 2025-12-05 12:10:08.094 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:08Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:f0:e8 10.100.0.14
Dec  5 07:10:08 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:08Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:f0:e8 10.100.0.14
Dec  5 07:10:09 np0005546909 nova_compute[187208]: 2025-12-05 12:10:09.617 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00710|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00711|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 nova_compute[187208]: 2025-12-05 12:10:10.388 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00712|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00713|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 nova_compute[187208]: 2025-12-05 12:10:10.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:10 np0005546909 NetworkManager[55691]: <info>  [1764936610.7661] manager: (patch-br-int-to-provnet-4d379fb6-127b-4441-995d-a70eac7d372c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/276)
Dec  5 07:10:10 np0005546909 NetworkManager[55691]: <info>  [1764936610.7676] manager: (patch-provnet-4d379fb6-127b-4441-995d-a70eac7d372c-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Dec  5 07:10:10 np0005546909 nova_compute[187208]: 2025-12-05 12:10:10.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00714|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 nova_compute[187208]: 2025-12-05 12:10:10.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:10Z|00715|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:10:10 np0005546909 nova_compute[187208]: 2025-12-05 12:10:10.809 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:11 np0005546909 podman[231637]: 2025-12-05 12:10:11.230952739 +0000 UTC m=+0.073954888 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.537 187212 DEBUG nova.compute.manager [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Received event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.537 187212 DEBUG nova.compute.manager [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing instance network info cache due to event network-changed-ef99bad5-d092-46f6-9b3a-8225cc233d1e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.538 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.538 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.539 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Refreshing network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:10:11 np0005546909 kernel: tap7370bdd5-dd (unregistering): left promiscuous mode
Dec  5 07:10:11 np0005546909 NetworkManager[55691]: <info>  [1764936611.7987] device (tap7370bdd5-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:10:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:11Z|00716|binding|INFO|Releasing lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef from this chassis (sb_readonly=0)
Dec  5 07:10:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:11Z|00717|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef down in Southbound
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:11 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:11Z|00718|binding|INFO|Removing iface tap7370bdd5-dd ovn-installed in OVS
Dec  5 07:10:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.817 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:10:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.818 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis#033[00m
Dec  5 07:10:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.819 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:10:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.820 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d871d2d-9bb8-4988-a853-19aa53a16383]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:11 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:11.821 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore#033[00m
Dec  5 07:10:11 np0005546909 nova_compute[187208]: 2025-12-05 12:10:11.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:11 np0005546909 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000049.scope: Deactivated successfully.
Dec  5 07:10:11 np0005546909 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d00000049.scope: Consumed 12.620s CPU time.
Dec  5 07:10:11 np0005546909 systemd-machined[153543]: Machine qemu-82-instance-00000049 terminated.
Dec  5 07:10:11 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : haproxy version is 2.8.14-c23fe91
Dec  5 07:10:11 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [NOTICE]   (231317) : path to executable is /usr/sbin/haproxy
Dec  5 07:10:11 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [WARNING]  (231317) : Exiting Master process...
Dec  5 07:10:11 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [ALERT]    (231317) : Current worker (231320) exited with code 143 (Terminated)
Dec  5 07:10:11 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231297]: [WARNING]  (231317) : All workers exited. Exiting... (0)
Dec  5 07:10:11 np0005546909 systemd[1]: libpod-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope: Deactivated successfully.
Dec  5 07:10:11 np0005546909 conmon[231297]: conmon 73d6e318f6f61d863eb4 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope/container/memory.events
Dec  5 07:10:11 np0005546909 podman[231681]: 2025-12-05 12:10:11.953503706 +0000 UTC m=+0.045385931 container died 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:10:11 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e-userdata-shm.mount: Deactivated successfully.
Dec  5 07:10:11 np0005546909 systemd[1]: var-lib-containers-storage-overlay-db0958bae067600c1586deb306b205beef4d4a15d45a054b88ed994a15bf001d-merged.mount: Deactivated successfully.
Dec  5 07:10:12 np0005546909 podman[231681]: 2025-12-05 12:10:12.00462799 +0000 UTC m=+0.096510185 container cleanup 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:10:12 np0005546909 systemd[1]: libpod-conmon-73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e.scope: Deactivated successfully.
Dec  5 07:10:12 np0005546909 podman[231710]: 2025-12-05 12:10:12.085466084 +0000 UTC m=+0.058989260 container remove 73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf92faa-2035-4ee9-ae13-79398681ee19]: (4, ('Fri Dec  5 12:10:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e)\n73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e\nFri Dec  5 12:10:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e)\n73d6e318f6f61d863eb4e210dfb9746ac733df9e68c06ffa31bb94eb294cca0e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.100 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[073e7487-2007-4378-abde-401f1ec0740c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.100 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:12 np0005546909 kernel: tap7be4540a-00: left promiscuous mode
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.120 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1040e75b-41ce-4f8c-bb86-8fedfffafb9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.136 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c20400fd-7934-4e8b-baf9-0419fa691322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.137 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[839a3dae-5c0f-49e4-87fa-016011b855a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8805a3f7-3dc2-436f-8d93-19d305799144]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397461, 'reachable_time': 16180, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231742, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.155 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:10:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:12.155 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[40daec45-7d32-430a-97fc-dd5463050dc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:12 np0005546909 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.633 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.639 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.644 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.644 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project
-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:09:58Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.645 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.646 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.646 187212 DEBUG os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.648 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7370bdd5-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.695 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.700 187212 INFO os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.701 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deleting instance files /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.702 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deletion of /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del complete#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.873 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.874 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating image(s)#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.874 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.875 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.875 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.888 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.950 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.952 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.952 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:12 np0005546909 nova_compute[187208]: 2025-12-05 12:10:12.963 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.018 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.020 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.096 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.110 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.112 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.113 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.193 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.197 187212 DEBUG nova.virt.disk.api [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.198 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.262 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.264 187212 DEBUG nova.virt.disk.api [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.265 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.265 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Ensure instance console log exists: /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.266 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.266 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.267 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.269 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start _get_guest_xml network_info=[{"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.275 187212 WARNING nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.282 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.283 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.287 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.287 187212 DEBUG nova.virt.libvirt.host [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.288 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.288 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.289 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.290 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.291 187212 DEBUG nova.virt.hardware [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.292 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.319 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempe
st-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:12Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.320 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.321 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.322 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <uuid>159b5354-c124-484f-a8ec-da1abf719114</uuid>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <name>instance-00000049</name>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-2012489303</nova:name>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:10:13</nova:creationTime>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        <nova:port uuid="7370bdd5-ddf8-40de-9f35-975b8ceab3ef">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="serial">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="uuid">159b5354-c124-484f-a8ec-da1abf719114</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ee:f0:e8"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <target dev="tap7370bdd5-dd"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/console.log" append="off"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:10:13 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:10:13 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:10:13 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:10:13 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.328 187212 DEBUG nova.virt.libvirt.vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:09:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:12Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.329 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.329 187212 DEBUG nova.network.os_vif_util [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.330 187212 DEBUG os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.331 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.332 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.335 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.335 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7370bdd5-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.336 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7370bdd5-dd, col_values=(('external_ids', {'iface-id': '7370bdd5-ddf8-40de-9f35-975b8ceab3ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:f0:e8', 'vm-uuid': '159b5354-c124-484f-a8ec-da1abf719114'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.337 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:13 np0005546909 NetworkManager[55691]: <info>  [1764936613.3385] manager: (tap7370bdd5-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.340 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.345 187212 INFO os_vif [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.375 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936598.3729482, dbbad270-1e3c-41e1-9173-c1b9df0ab2dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.376 187212 INFO nova.compute.manager [-] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.399 187212 DEBUG nova.compute.manager [None req-b82bb1ab-4def-4890-b17e-e3bf1ca3be4e - - - - - -] [instance: dbbad270-1e3c-41e1-9173-c1b9df0ab2dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.581 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.596 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.597 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] No VIF found with MAC fa:16:3e:ee:f0:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.597 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Using config drive#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.627 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'ec2_ids' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.675 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'keypairs' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.929 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.930 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.931 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.931 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 WARNING nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-unplugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state rebuild_spawning.#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.932 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.933 187212 DEBUG oslo_concurrency.lockutils [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.934 187212 DEBUG nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:10:13 np0005546909 nova_compute[187208]: 2025-12-05 12:10:13.934 187212 WARNING nova.compute.manager [req-3de435dd-6af7-478e-a45f-f45ce7c3b85a req-171323ec-cd17-49b8-a7a1-a8a38fb355df 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state rebuild_spawning.#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.132 187212 INFO nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Creating config drive at /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.140 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqgyouyp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.274 187212 DEBUG oslo_concurrency.processutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyqgyouyp" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:14 np0005546909 kernel: tap7370bdd5-dd: entered promiscuous mode
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.360 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:14Z|00719|binding|INFO|Claiming lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef for this chassis.
Dec  5 07:10:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:14Z|00720|binding|INFO|7370bdd5-ddf8-40de-9f35-975b8ceab3ef: Claiming fa:16:3e:ee:f0:e8 10.100.0.14
Dec  5 07:10:14 np0005546909 systemd-udevd[231663]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.3686] manager: (tap7370bdd5-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Dec  5 07:10:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:14Z|00721|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef up in Southbound
Dec  5 07:10:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:14Z|00722|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef ovn-installed in OVS
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.374 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.375 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c bound to our chassis#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.377 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.378 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.3814] device (tap7370bdd5-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.3828] device (tap7370bdd5-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.386 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.389 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ce0fc5-e216-4c46-87bb-105853d42c46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.390 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7be4540a-01 in ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.392 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7be4540a-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.392 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7abb17c1-76f1-4795-9b8a-2bab28fcddd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.394 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.393 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0a3705-652e-42ea-b22a-f24c94de52ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb8984d-9981-4223-bf5e-d381f5ba0ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 systemd-machined[153543]: New machine qemu-84-instance-00000049.
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.435 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[787a614b-6c4f-4509-bec3-bd94ea35d675]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 systemd[1]: Started Virtual Machine qemu-84-instance-00000049.
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.472 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[066ce48d-6c11-4fff-92f7-6cab2e11e650]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.4804] manager: (tap7be4540a-00): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b1af3abc-d7fd-4ece-a18f-a55bf604fdad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.516 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[aa634e7f-921e-4125-8227-5e0f8343b25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.520 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6bed1c8b-189d-448b-a7e6-e562019bf80a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.5457] device (tap7be4540a-00): carrier: link connected
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.550 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f2494267-3f0c-4594-b025-93e193f94e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.569 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1922ccf5-43ed-4e4a-a1a6-5a5a9de53681]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399510, 'reachable_time': 39809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231810, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.588 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e16e0830-274c-4e4d-b287-00869066bb71]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:4893'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399510, 'tstamp': 399510}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231811, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.605 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75c55457-84f9-4c05-841b-71dfa13c6eb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7be4540a-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:48:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399510, 'reachable_time': 39809, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231812, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.641 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[68198b47-0448-4dbc-9286-63f85cfc7527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.698 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce09e67-bf41-420f-b698-b51a7bafafef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.699 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.700 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.700 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7be4540a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 kernel: tap7be4540a-00: entered promiscuous mode
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 NetworkManager[55691]: <info>  [1764936614.7047] manager: (tap7be4540a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.704 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7be4540a-00, col_values=(('external_ids', {'iface-id': '4dcf8e96-bf04-4914-959a-aad071dfa454'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:14Z|00723|binding|INFO|Releasing lport 4dcf8e96-bf04-4914-959a-aad071dfa454 from this chassis (sb_readonly=0)
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.720 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.721 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.721 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4255ba7b-a0e4-4a7e-a88a-c3a760715448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.722 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.pid.haproxy
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:10:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:14.723 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'env', 'PROCESS_TAG=haproxy-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.898 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 159b5354-c124-484f-a8ec-da1abf719114 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.899 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936614.8985865, 159b5354-c124-484f-a8ec-da1abf719114 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.899 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.902 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.902 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.905 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance spawned successfully.#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.906 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.955 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.959 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.960 187212 DEBUG nova.virt.libvirt.driver [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:10:14 np0005546909 nova_compute[187208]: 2025-12-05 12:10:14.965 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.059 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.061 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936614.9008446, 159b5354-c124-484f-a8ec-da1abf719114 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.062 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Started (Lifecycle Event)
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.103 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.108 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.122 187212 DEBUG nova.compute.manager [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.135 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec  5 07:10:15 np0005546909 podman[231852]: 2025-12-05 12:10:15.107812546 +0000 UTC m=+0.029845945 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.249 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.250 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.250 187212 DEBUG nova.objects.instance [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec  5 07:10:15 np0005546909 podman[231852]: 2025-12-05 12:10:15.254455605 +0000 UTC m=+0.176488994 container create 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:10:15 np0005546909 systemd[1]: Started libpod-conmon-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope.
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.314 187212 DEBUG oslo_concurrency.lockutils [None req-68b63dd9-dfcc-41a9-93f0-ad655a8949f4 ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:10:15 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:10:15 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf1f79bda5e68e5137db659fc06127f1818d3fbe95292e9ac02d6cba31e7d86e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:10:15 np0005546909 podman[231852]: 2025-12-05 12:10:15.427241792 +0000 UTC m=+0.349275201 container init 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:10:15 np0005546909 podman[231852]: 2025-12-05 12:10:15.433354137 +0000 UTC m=+0.355387516 container start 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:10:15 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : New worker (231879) forked
Dec  5 07:10:15 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : Loading success.
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.524 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updated VIF entry in instance network info cache for port ef99bad5-d092-46f6-9b3a-8225cc233d1e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.525 187212 DEBUG nova.network.neutron [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 54d9605a-998b-4492-afc8-f7a5b0dd4e84] Updating instance_info_cache with network_info: [{"id": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "address": "fa:16:3e:bd:e5:94", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapef99bad5-d0", "ovs_interfaceid": "ef99bad5-d092-46f6-9b3a-8225cc233d1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  5 07:10:15 np0005546909 nova_compute[187208]: 2025-12-05 12:10:15.637 187212 DEBUG oslo_concurrency.lockutils [req-a6eca461-8c95-473c-8c08-6a5437fe759d req-418e70d8-7f6f-4426-85e0-4d32ef4a5b99 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-54d9605a-998b-4492-afc8-f7a5b0dd4e84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  5 07:10:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:16Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:e5:94 10.100.0.6
Dec  5 07:10:16 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:16Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:e5:94 10.100.0.6
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.842 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.842 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 WARNING nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.843 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG oslo_concurrency.lockutils [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 DEBUG nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] No waiting events found dispatching network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  5 07:10:16 np0005546909 nova_compute[187208]: 2025-12-05 12:10:16.844 187212 WARNING nova.compute.manager [req-650816e8-7cd4-490b-adcf-632ea2349553 req-f3568107-59bb-4bb2-8168-4a904df7fb90 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received unexpected event network-vif-plugged-7370bdd5-ddf8-40de-9f35-975b8ceab3ef for instance with vm_state active and task_state None.
Dec  5 07:10:17 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.247 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:10:17 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.742 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:17 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.742 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:17 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.878 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:10:17 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.998 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:17.999 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.006 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.006 187212 INFO nova.compute.claims [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.099 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.130 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.131 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.156 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.254 187212 DEBUG nova.compute.provider_tree [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.280 187212 DEBUG nova.scheduler.client.report [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.286 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.331 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.332 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.335 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.338 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.342 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.342 187212 INFO nova.compute.claims [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.409 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.410 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.440 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.553 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.660 187212 DEBUG nova.compute.provider_tree [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.670 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.671 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating image(s)
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.672 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.673 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.689 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.710 187212 DEBUG nova.scheduler.client.report [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.748 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.750 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.751 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.763 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.808 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.809 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.817 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.818 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.874 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.875 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.875 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.902 187212 DEBUG nova.policy [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.930 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.931 187212 DEBUG nova.virt.disk.api [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Checking if we can resize image /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.931 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.992 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.993 187212 DEBUG nova.virt.disk.api [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Cannot resize image /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:18 np0005546909 nova_compute[187208]: 2025-12-05 12:10:18.993 187212 DEBUG nova.objects.instance [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'migration_context' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.027 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.028 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Ensure instance console log exists: /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.028 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.029 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.029 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:19 np0005546909 podman[231908]: 2025-12-05 12:10:19.210781418 +0000 UTC m=+0.056527129 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.217 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.235 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.263 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.447 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.449 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.449 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating image(s)#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.450 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.450 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.451 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.463 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.517 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.518 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.519 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.531 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.587 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.589 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.625 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.627 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.627 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.691 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.692 187212 DEBUG nova.virt.disk.api [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Checking if we can resize image /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.692 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.760 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.761 187212 DEBUG nova.virt.disk.api [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Cannot resize image /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.762 187212 DEBUG nova.objects.instance [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'migration_context' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.832 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.833 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Ensure instance console log exists: /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.834 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.835 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.835 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.838 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.845 187212 WARNING nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.852 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.853 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.857 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.857 187212 DEBUG nova.virt.libvirt.host [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.858 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.858 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.859 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.860 187212 DEBUG nova.virt.hardware [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.864 187212 DEBUG nova.objects.instance [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.890 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.891 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.891 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "159b5354-c124-484f-a8ec-da1abf719114-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.893 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.893 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.895 187212 INFO nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Terminating instance#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.896 187212 DEBUG nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:10:19 np0005546909 kernel: tap7370bdd5-dd (unregistering): left promiscuous mode
Dec  5 07:10:19 np0005546909 NetworkManager[55691]: <info>  [1764936619.9214] device (tap7370bdd5-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.922 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <uuid>4053596b-9c68-4044-bb28-5f57016c8e62</uuid>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <name>instance-0000004c</name>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServersAaction247Test-server-1533885512</nova:name>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:10:19</nova:creationTime>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:user uuid="b63e7b2645a24842a40a218743fdda6f">tempest-ServersAaction247Test-924836898-project-member</nova:user>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:        <nova:project uuid="b6854395cda4464cb303b7eb51b4e4f1">tempest-ServersAaction247Test-924836898</nova:project>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="serial">4053596b-9c68-4044-bb28-5f57016c8e62</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="uuid">4053596b-9c68-4044-bb28-5f57016c8e62</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/console.log" append="off"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:10:19 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:10:19 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:10:19 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:10:19 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:10:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:19Z|00724|binding|INFO|Releasing lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef from this chassis (sb_readonly=0)
Dec  5 07:10:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:19Z|00725|binding|INFO|Setting lport 7370bdd5-ddf8-40de-9f35-975b8ceab3ef down in Southbound
Dec  5 07:10:19 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:19Z|00726|binding|INFO|Removing iface tap7370bdd5-dd ovn-installed in OVS
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.927 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.936 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:f0:e8 10.100.0.14'], port_security=['fa:16:3e:ee:f0:e8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '159b5354-c124-484f-a8ec-da1abf719114', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e836357870d746e49bc783da7cd3accd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1af73b4f-9447-4eb7-8c28-431fbbf8ffed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb3a2dd0-ca0e-4595-a83a-975a07395638, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=7370bdd5-ddf8-40de-9f35-975b8ceab3ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:10:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.938 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 7370bdd5-ddf8-40de-9f35-975b8ceab3ef in datapath 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c unbound from our chassis#033[00m
Dec  5 07:10:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.940 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:10:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.941 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e681993d-4b5b-4812-bc3f-9e275a9d91bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:19.941 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c namespace which is not needed anymore#033[00m
Dec  5 07:10:19 np0005546909 nova_compute[187208]: 2025-12-05 12:10:19.942 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:19 np0005546909 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000049.scope: Deactivated successfully.
Dec  5 07:10:19 np0005546909 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d00000049.scope: Consumed 5.521s CPU time.
Dec  5 07:10:19 np0005546909 systemd-machined[153543]: Machine qemu-84-instance-00000049 terminated.
Dec  5 07:10:20 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : haproxy version is 2.8.14-c23fe91
Dec  5 07:10:20 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [NOTICE]   (231872) : path to executable is /usr/sbin/haproxy
Dec  5 07:10:20 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [WARNING]  (231872) : Exiting Master process...
Dec  5 07:10:20 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [ALERT]    (231872) : Current worker (231879) exited with code 143 (Terminated)
Dec  5 07:10:20 np0005546909 neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c[231868]: [WARNING]  (231872) : All workers exited. Exiting... (0)
Dec  5 07:10:20 np0005546909 systemd[1]: libpod-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope: Deactivated successfully.
Dec  5 07:10:20 np0005546909 podman[231973]: 2025-12-05 12:10:20.071057869 +0000 UTC m=+0.047599564 container died 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec  5 07:10:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9-userdata-shm.mount: Deactivated successfully.
Dec  5 07:10:20 np0005546909 systemd[1]: var-lib-containers-storage-overlay-cf1f79bda5e68e5137db659fc06127f1818d3fbe95292e9ac02d6cba31e7d86e-merged.mount: Deactivated successfully.
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.152 187212 INFO nova.virt.libvirt.driver [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Instance destroyed successfully.#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.153 187212 DEBUG nova.objects.instance [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'resources' on Instance uuid 159b5354-c124-484f-a8ec-da1abf719114 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:20 np0005546909 podman[231973]: 2025-12-05 12:10:20.238579005 +0000 UTC m=+0.215120700 container cleanup 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:10:20 np0005546909 systemd[1]: libpod-conmon-13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9.scope: Deactivated successfully.
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.275 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.277 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.278 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Using config drive#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.280 187212 DEBUG nova.virt.libvirt.vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:09:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2012489303',display_name='tempest-ServerDiskConfigTestJSON-server-2012489303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2012489303',id=73,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:10:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-flfq46vv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw
_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerDiskConfigTestJSON-1245488084-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:10:15Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=159b5354-c124-484f-a8ec-da1abf719114,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.281 187212 DEBUG nova.network.os_vif_util [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "address": "fa:16:3e:ee:f0:e8", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7370bdd5-dd", "ovs_interfaceid": "7370bdd5-ddf8-40de-9f35-975b8ceab3ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.282 187212 DEBUG nova.network.os_vif_util [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.282 187212 DEBUG os_vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.283 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.284 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7370bdd5-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.288 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.290 187212 INFO os_vif [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:f0:e8,bridge_name='br-int',has_traffic_filtering=True,id=7370bdd5-ddf8-40de-9f35-975b8ceab3ef,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7370bdd5-dd')#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.291 187212 INFO nova.virt.libvirt.driver [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deleting instance files /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.291 187212 INFO nova.virt.libvirt.driver [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deletion of /var/lib/nova/instances/159b5354-c124-484f-a8ec-da1abf719114_del complete#033[00m
Dec  5 07:10:20 np0005546909 podman[232020]: 2025-12-05 12:10:20.320776649 +0000 UTC m=+0.058107195 container remove 13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.327 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[41378f74-c5a7-4b66-87ef-1da9a931676a]: (4, ('Fri Dec  5 12:10:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9)\n13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9\nFri Dec  5 12:10:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c (13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9)\n13d3488120d4cfe26e3dc0bc33a4dadfd24e9eae2e8c42a0e8341b79e53a80f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.328 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd59179-d43e-45f4-ac17-91617f2d5be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.329 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7be4540a-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.331 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:20 np0005546909 kernel: tap7be4540a-00: left promiscuous mode
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.344 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c17c87-8e9a-432a-a038-4c0341f3ed9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.371 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d3eebf74-5cdc-4df1-ac4c-ebf55a2e1ebe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.373 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7d675916-6596-4142-9ddc-95862886c549]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.391 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8129ba88-6313-4765-b2c7-5600d9f6cb37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399503, 'reachable_time': 29416, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232033, 'error': None, 'target': 'ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.394 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:10:20 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:20.394 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[6d160eae-f368-4ee2-a6f6-b129be714556]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:20 np0005546909 systemd[1]: run-netns-ovnmeta\x2d7be4540a\x2d0e0d\x2d45dd\x2d9ed3\x2d2c2701ae3e2c.mount: Deactivated successfully.
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.472 187212 INFO nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Creating config drive at /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.478 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3wxhtg3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.514 187212 INFO nova.compute.manager [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 0.62 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.515 187212 DEBUG oslo.service.loopingcall [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.516 187212 DEBUG nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.516 187212 DEBUG nova.network.neutron [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:10:20 np0005546909 nova_compute[187208]: 2025-12-05 12:10:20.609 187212 DEBUG oslo_concurrency.processutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3wxhtg3" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:20 np0005546909 systemd-machined[153543]: New machine qemu-85-instance-0000004c.
Dec  5 07:10:20 np0005546909 systemd[1]: Started Virtual Machine qemu-85-instance-0000004c.
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.115 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936621.1148732, 4053596b-9c68-4044-bb28-5f57016c8e62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.115 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.117 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.118 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.122 187212 INFO nova.virt.libvirt.driver [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance spawned successfully.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.122 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.135 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.141 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.144 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.144 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.145 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.145 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.146 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.146 187212 DEBUG nova.virt.libvirt.driver [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936621.1160429, 4053596b-9c68-4044-bb28-5f57016c8e62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.188 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] VM Started (Lifecycle Event)#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.223 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.226 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.231 187212 INFO nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 1.78 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.231 187212 DEBUG nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.254 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.280 187212 INFO nova.compute.manager [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 3.01 seconds to build instance.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.299 187212 DEBUG oslo_concurrency.lockutils [None req-89feba30-9683-44f5-9899-4de6b89d1c46 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.411 187212 DEBUG nova.network.neutron [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.432 187212 INFO nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Took 0.92 seconds to deallocate network for instance.#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.484 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.484 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.598 187212 DEBUG nova.compute.provider_tree [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.621 187212 DEBUG nova.scheduler.client.report [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.649 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.683 187212 INFO nova.scheduler.client.report [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Deleted allocations for instance 159b5354-c124-484f-a8ec-da1abf719114#033[00m
Dec  5 07:10:21 np0005546909 nova_compute[187208]: 2025-12-05 12:10:21.749 187212 DEBUG oslo_concurrency.lockutils [None req-c2c0c618-b8bc-4706-b47d-d8f22ebc54ef ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "159b5354-c124-484f-a8ec-da1abf719114" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:22 np0005546909 nova_compute[187208]: 2025-12-05 12:10:22.743 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Successfully created port: 88e41011-3ebc-4215-ad20-58a49d31a6d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:10:22 np0005546909 nova_compute[187208]: 2025-12-05 12:10:22.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:23 np0005546909 nova_compute[187208]: 2025-12-05 12:10:23.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:23 np0005546909 nova_compute[187208]: 2025-12-05 12:10:23.525 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:23 np0005546909 nova_compute[187208]: 2025-12-05 12:10:23.612 187212 DEBUG nova.compute.manager [req-fdfa2c01-e50d-4f1d-830d-2946122a78dd req-3d143c0c-c39f-4999-8f91-3139f6e0c2ee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Received event network-vif-deleted-7370bdd5-ddf8-40de-9f35-975b8ceab3ef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.111 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.150 187212 INFO nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] instance snapshotting#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.151 187212 DEBUG nova.objects.instance [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'flavor' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.512 187212 INFO nova.virt.libvirt.driver [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Beginning live snapshot process#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.609 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.609 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "8fe1c6df-f787-4c56-b3e7-899cf5e9f723" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.624 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.663 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.664 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.665 187212 INFO nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Terminating instance#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquired lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.666 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.687 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.688 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.696 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.696 187212 INFO nova.compute.claims [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.855 187212 DEBUG nova.compute.provider_tree [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.873 187212 DEBUG nova.scheduler.client.report [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.899 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.900 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.953 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.953 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.974 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:10:24 np0005546909 nova_compute[187208]: 2025-12-05 12:10:24.990 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.094 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.095 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.095 187212 INFO nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Creating image(s)#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.096 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.096 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.097 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.108 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.198 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.200 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.200 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.215 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:25 np0005546909 podman[232062]: 2025-12-05 12:10:25.236148231 +0000 UTC m=+0.080603959 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.236 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.281 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.282 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.300 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.335 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.336 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.337 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.337 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.392 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.394 187212 DEBUG nova.virt.disk.api [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Checking if we can resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.394 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.415 187212 DEBUG nova.policy [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef254bb2df0442c6bcadfb3a6861c0e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e836357870d746e49bc783da7cd3accd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.457 187212 DEBUG oslo_concurrency.processutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.458 187212 DEBUG nova.virt.disk.api [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Cannot resize image /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.459 187212 DEBUG nova.objects.instance [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'migration_context' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.477 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Ensure instance console log exists: /var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.478 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.479 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.638 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.694 187212 DEBUG nova.network.neutron [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.724 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Releasing lock "refresh_cache-4053596b-9c68-4044-bb28-5f57016c8e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.725 187212 DEBUG nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:10:25 np0005546909 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Dec  5 07:10:25 np0005546909 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d0000004c.scope: Consumed 5.116s CPU time.
Dec  5 07:10:25 np0005546909 systemd-machined[153543]: Machine qemu-85-instance-0000004c terminated.
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.987 187212 INFO nova.virt.libvirt.driver [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance destroyed successfully.#033[00m
Dec  5 07:10:25 np0005546909 nova_compute[187208]: 2025-12-05 12:10:25.988 187212 DEBUG nova.objects.instance [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lazy-loading 'resources' on Instance uuid 4053596b-9c68-4044-bb28-5f57016c8e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.017 187212 INFO nova.virt.libvirt.driver [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deleting instance files /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62_del#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.018 187212 INFO nova.virt.libvirt.driver [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deletion of /var/lib/nova/instances/4053596b-9c68-4044-bb28-5f57016c8e62_del complete#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.086 187212 INFO nova.compute.manager [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.086 187212 DEBUG oslo.service.loopingcall [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.087 187212 DEBUG nova.compute.manager [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.087 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.186 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.187 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "e689e2f0-16e9-402a-986e-a769d72fa0bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.205 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.275 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.275 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.276 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Successfully updated port: 88e41011-3ebc-4215-ad20-58a49d31a6d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.285 187212 DEBUG nova.virt.hardware [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.285 187212 INFO nova.compute.claims [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.291 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.292 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.292 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.321 187212 DEBUG nova.compute.manager [None req-a3163107-84b9-4927-8b9f-43a19478e70c b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.355 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.375 187212 DEBUG nova.network.neutron [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.389 187212 INFO nova.compute.manager [-] [instance: 4053596b-9c68-4044-bb28-5f57016c8e62] Took 0.30 seconds to deallocate network for instance.#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.437 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.456 187212 DEBUG nova.compute.manager [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Received event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.457 187212 DEBUG nova.compute.manager [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing instance network info cache due to event network-changed-88e41011-3ebc-4215-ad20-58a49d31a6d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.457 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.518 187212 DEBUG nova.compute.provider_tree [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.534 187212 DEBUG nova.scheduler.client.report [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.555 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.556 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.559 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.612 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.612 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.634 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.666 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.696 187212 DEBUG nova.compute.provider_tree [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.740 187212 DEBUG nova.scheduler.client.report [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.774 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.925 187212 DEBUG nova.compute.manager [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.927 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.928 187212 INFO nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Creating image(s)#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.929 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.929 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.930 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "/var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:26 np0005546909 nova_compute[187208]: 2025-12-05 12:10:26.956 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.053 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.055 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.056 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.081 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.168 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.170 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.441 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk 1073741824" returned: 0 in 0.271s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.442 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.443 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.505 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.507 187212 DEBUG nova.virt.disk.api [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Checking if we can resize image /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.508 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.556 187212 INFO nova.scheduler.client.report [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Deleted allocations for instance 4053596b-9c68-4044-bb28-5f57016c8e62#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.572 187212 DEBUG oslo_concurrency.processutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.573 187212 DEBUG nova.virt.disk.api [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Cannot resize image /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.573 187212 DEBUG nova.objects.instance [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lazy-loading 'migration_context' on Instance uuid e689e2f0-16e9-402a-986e-a769d72fa0bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.580 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.620 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.621 187212 DEBUG nova.virt.libvirt.driver [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Ensure instance console log exists: /var/lib/nova/instances/e689e2f0-16e9-402a-986e-a769d72fa0bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.621 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.622 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.622 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.758 187212 DEBUG oslo_concurrency.lockutils [None req-a1775fe4-3051-49cd-9a3b-f4c04f440a59 b63e7b2645a24842a40a218743fdda6f b6854395cda4464cb303b7eb51b4e4f1 - - default default] Lock "4053596b-9c68-4044-bb28-5f57016c8e62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:27 np0005546909 nova_compute[187208]: 2025-12-05 12:10:27.987 187212 DEBUG nova.policy [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b8b32a7fde5424795b54914a14028b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e3f3e747de24befad6008f67eb551ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:10:28 np0005546909 nova_compute[187208]: 2025-12-05 12:10:28.104 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:28 np0005546909 nova_compute[187208]: 2025-12-05 12:10:28.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:28 np0005546909 nova_compute[187208]: 2025-12-05 12:10:28.776 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Successfully created port: 8c343187-712d-4aee-9c47-18497ec1042e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.074 187212 DEBUG nova.network.neutron [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Instance network_info: |[{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.097 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.098 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Refreshing network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.101 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Start _get_guest_xml network_info=[{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.106 187212 WARNING nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.110 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.110 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.117 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.117 187212 DEBUG nova.virt.libvirt.host [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.118 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.119 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.120 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.121 187212 DEBUG nova.virt.hardware [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.124 187212 DEBUG nova.virt.libvirt.vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.125 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.126 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.127 187212 DEBUG nova.objects.instance [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lazy-loading 'pci_devices' on Instance uuid ecc25cb4-5b3a-43f7-949d-ca9a1a19056a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.192 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <uuid>ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</uuid>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <name>instance-0000004b</name>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-914539058</nova:name>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:10:30</nova:creationTime>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:user uuid="242b773b0af24caf814e2a84178332d5">tempest-AttachInterfacesTestJSON-755891038-project-member</nova:user>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:project uuid="98681240c47b41cba28d91e1c11fd71f">tempest-AttachInterfacesTestJSON-755891038</nova:project>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        <nova:port uuid="88e41011-3ebc-4215-ad20-58a49d31a6d4">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="serial">ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="uuid">ecc25cb4-5b3a-43f7-949d-ca9a1a19056a</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:a2:40:d1"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <target dev="tap88e41011-3e"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/console.log" append="off"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:10:30 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:10:30 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:10:30 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:10:30 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.194 187212 DEBUG nova.compute.manager [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Preparing to wait for external event network-vif-plugged-88e41011-3ebc-4215-ad20-58a49d31a6d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Acquiring lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.195 187212 DEBUG oslo_concurrency.lockutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Lock "ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.196 187212 DEBUG nova.virt.libvirt.vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-914539058',display_name='tempest-tempest.common.compute-instance-914539058',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-914539058',id=75,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOm03qWFSQ5103OHzXmrPAuroPlPlASDWpjaAFBZ67fEn8dhFJDy86s09scxA4Z1QJ5SyM81uczE2e6po9G16NpMT9VelctfScju7FTjnSWfqAVLfNhDpaQwjgP9O1/MXQ==',key_name='tempest-keypair-1446846217',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98681240c47b41cba28d91e1c11fd71f',ramdisk_id='',reservation_id='r-fs97ng4t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-755891038',owner_user_name='tempest-AttachInterfacesTestJSON-755891038-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='242b773b0af24caf814e2a84178332d5',uuid=ecc25cb4-5b3a-43f7-949d-ca9a1a19056a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.196 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converting VIF {"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.197 187212 DEBUG nova.network.os_vif_util [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.197 187212 DEBUG os_vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.198 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.199 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.203 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88e41011-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.204 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap88e41011-3e, col_values=(('external_ids', {'iface-id': '88e41011-3ebc-4215-ad20-58a49d31a6d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:40:d1', 'vm-uuid': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:30 np0005546909 NetworkManager[55691]: <info>  [1764936630.2064] manager: (tap88e41011-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.216 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.217 187212 INFO os_vif [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:40:d1,bridge_name='br-int',has_traffic_filtering=True,id=88e41011-3ebc-4215-ad20-58a49d31a6d4,network=Network(fbfed6fc-3701-4311-a4c2-8c49c5b7584c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap88e41011-3e')#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.279 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.280 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.280 187212 DEBUG nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] No VIF found with MAC fa:16:3e:a2:40:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.281 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Using config drive#033[00m
Dec  5 07:10:30 np0005546909 nova_compute[187208]: 2025-12-05 12:10:30.964 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Successfully created port: 10dc6775-d9c9-40ca-bd05-41c56cffc744 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.663 187212 INFO nova.virt.libvirt.driver [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Creating config drive at /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config#033[00m
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.671 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7xy1vkq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.805 187212 DEBUG oslo_concurrency.processutils [None req-591d5134-9b2b-4aaf-8ec4-05fcf1105bb6 242b773b0af24caf814e2a84178332d5 98681240c47b41cba28d91e1c11fd71f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl7xy1vkq" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.806 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "30cb83d4-3a34-4420-bc83-099b266da48c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.807 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "30cb83d4-3a34-4420-bc83-099b266da48c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:31 np0005546909 kernel: tap88e41011-3e: entered promiscuous mode
Dec  5 07:10:31 np0005546909 NetworkManager[55691]: <info>  [1764936631.8747] manager: (tap88e41011-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.874 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:31Z|00727|binding|INFO|Claiming lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 for this chassis.
Dec  5 07:10:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:31Z|00728|binding|INFO|88e41011-3ebc-4215-ad20-58a49d31a6d4: Claiming fa:16:3e:a2:40:d1 10.100.0.8
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.888 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:31Z|00729|binding|INFO|Setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 ovn-installed in OVS
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:31 np0005546909 systemd-udevd[232140]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:10:31 np0005546909 systemd-machined[153543]: New machine qemu-86-instance-0000004b.
Dec  5 07:10:31 np0005546909 NetworkManager[55691]: <info>  [1764936631.9234] device (tap88e41011-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:10:31 np0005546909 NetworkManager[55691]: <info>  [1764936631.9239] device (tap88e41011-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:10:31 np0005546909 nova_compute[187208]: 2025-12-05 12:10:31.925 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:10:31 np0005546909 systemd[1]: Started Virtual Machine qemu-86-instance-0000004b.
Dec  5 07:10:31 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:31Z|00730|binding|INFO|Setting lport 88e41011-3ebc-4215-ad20-58a49d31a6d4 up in Southbound
Dec  5 07:10:31 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.972 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:40:d1 10.100.0.8'], port_security=['fa:16:3e:a2:40:d1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98681240c47b41cba28d91e1c11fd71f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbdd2780-9e2b-4e10-8d0a-98de936cf6ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c94839a8-8979-4909-a8e1-cdd384f46390, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=88e41011-3ebc-4215-ad20-58a49d31a6d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:10:31 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.973 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 88e41011-3ebc-4215-ad20-58a49d31a6d4 in datapath fbfed6fc-3701-4311-a4c2-8c49c5b7584c bound to our chassis#033[00m
Dec  5 07:10:31 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.976 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbfed6fc-3701-4311-a4c2-8c49c5b7584c#033[00m
Dec  5 07:10:31 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:31.996 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6ac8ca-bed6-4e05-a210-486c26fa2f34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.045 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c11f060c-ae5e-464c-a0ce-ec6c36694023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.049 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5d8482a5-9ca7-4c29-acdb-0ee87c314caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.090 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5b0d6b-6549-48b9-94d8-d95c64120559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.111 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84afbee7-499f-47dd-9020-7d22fbb8bbb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbfed6fc-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:26:88:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 398313, 'reachable_time': 28299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232155, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb50bd2-16a7-4e91-9ff7-967d06bebc1b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398326, 'tstamp': 398326}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfbfed6fc-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 398330, 'tstamp': 398330}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232156, 'error': None, 'target': 'ovnmeta-fbfed6fc-3701-4311-a4c2-8c49c5b7584c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.134 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbfed6fc-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbfed6fc-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.137 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.137 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbfed6fc-30, col_values=(('external_ids', {'iface-id': 'c2b03c34-62ec-4644-b043-43f2baa5f384'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:10:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:10:32.138 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.206 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.207 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.221 187212 DEBUG nova.virt.hardware [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.221 187212 INFO nova.compute.claims [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.397 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936632.3973331, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.398 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Started (Lifecycle Event)#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.421 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.426 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936632.3975806, ecc25cb4-5b3a-43f7-949d-ca9a1a19056a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.426 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.672 187212 DEBUG nova.compute.provider_tree [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.770 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.771 187212 DEBUG nova.scheduler.client.report [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.779 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.869 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.958 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:32 np0005546909 nova_compute[187208]: 2025-12-05 12:10:32.959 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.094 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.095 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.118 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.141 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:10:33 np0005546909 podman[232165]: 2025-12-05 12:10:33.218853843 +0000 UTC m=+0.063469379 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.228 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updated VIF entry in instance network info cache for port 88e41011-3ebc-4215-ad20-58a49d31a6d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.228 187212 DEBUG nova.network.neutron [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: ecc25cb4-5b3a-43f7-949d-ca9a1a19056a] Updating instance_info_cache with network_info: [{"id": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "address": "fa:16:3e:a2:40:d1", "network": {"id": "fbfed6fc-3701-4311-a4c2-8c49c5b7584c", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1379220417-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98681240c47b41cba28d91e1c11fd71f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap88e41011-3e", "ovs_interfaceid": "88e41011-3ebc-4215-ad20-58a49d31a6d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:33 np0005546909 podman[232164]: 2025-12-05 12:10:33.254180724 +0000 UTC m=+0.100643772 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.506 187212 DEBUG oslo_concurrency.lockutils [req-ec4b7ddf-1a0a-4588-8fac-a6d41508a307 req-08385dd1-a131-4827-9113-386a75699a1b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.627 187212 DEBUG nova.compute.manager [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.629 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 INFO nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Creating image(s)#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.630 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.631 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "/var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.646 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.669 187212 DEBUG nova.policy [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62153b585ecc4e6fa2ad567851d49081', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c982a61e3fc4c8da9248076bb0361ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.708 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.709 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.710 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.722 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.803 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.805 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.946 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk 1073741824" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.947 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:33 np0005546909 nova_compute[187208]: 2025-12-05 12:10:33.948 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.007 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.008 187212 DEBUG nova.virt.disk.api [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Checking if we can resize image /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.009 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.064 187212 DEBUG oslo_concurrency.processutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.065 187212 DEBUG nova.virt.disk.api [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Cannot resize image /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.066 187212 DEBUG nova.objects.instance [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lazy-loading 'migration_context' on Instance uuid 30cb83d4-3a34-4420-bc83-099b266da48c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.085 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.085 187212 DEBUG nova.virt.libvirt.driver [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Ensure instance console log exists: /var/lib/nova/instances/30cb83d4-3a34-4420-bc83-099b266da48c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.086 187212 DEBUG oslo_concurrency.lockutils [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.328 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Successfully updated port: 8c343187-712d-4aee-9c47-18497ec1042e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.619 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquiring lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.619 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Acquired lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.620 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.624 187212 DEBUG nova.compute.manager [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Received event network-changed-8c343187-712d-4aee-9c47-18497ec1042e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.624 187212 DEBUG nova.compute.manager [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Refreshing instance network info cache due to event network-changed-8c343187-712d-4aee-9c47-18497ec1042e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:10:34 np0005546909 nova_compute[187208]: 2025-12-05 12:10:34.625 187212 DEBUG oslo_concurrency.lockutils [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.072 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.151 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936620.150944, 159b5354-c124-484f-a8ec-da1abf719114 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.152 187212 INFO nova.compute.manager [-] [instance: 159b5354-c124-484f-a8ec-da1abf719114] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.173 187212 DEBUG nova.compute.manager [None req-7daee835-e878-462a-bd9a-eaed73c7a230 - - - - - -] [instance: 159b5354-c124-484f-a8ec-da1abf719114] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.206 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.363 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'name': 'tempest-tempest.common.compute-instance-569275018', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.366 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'name': 'tempest-tempest.common.compute-instance-914539058', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '98681240c47b41cba28d91e1c11fd71f', 'user_id': '242b773b0af24caf814e2a84178332d5', 'hostId': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.372 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 54d9605a-998b-4492-afc8-f7a5b0dd4e84 / tapef99bad5-d0 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.372 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.375 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ecc25cb4-5b3a-43f7-949d-ca9a1a19056a / tap88e41011-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.375 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ee6244f-292c-4898-87a3-40503dceef6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.367882', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67879fa2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'e3ffc561d974dcff7de17e61f5f0c206ed5d5d5b21464365a7e9b4cbe23860c1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.367882', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '678816a8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'b166f8b3ed46eb1d65447ebc6c58fa0a6075dd35772a1c23e254a6062a42d5b9'}]}, 'timestamp': '2025-12-05 12:10:35.376371', '_unique_id': '740dfc1403364ebd81137d408fc591be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.377 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.392 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.393 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.403 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.404 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0ceb262-3873-4349-9293-eb3f104a75bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '678aa5a8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'a0d49b49f94a1568fcd16d3188a028ec2f0eb7be686e72c562499bb4203a58cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '678ab6a6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'c5a642a2c3ec58f75730ae40ee2df8dd7e4281c27729ecefb598d5e5fe604756'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '678c509c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'a561e77b8cd9565b28ec86370f9e2fc168d38a2f92b5e73e03333eea663f03a5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.379804', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '678c5fa6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'ef3d5574e120c41fa5d21593b949debec714a2cae2ccb0d64d35d57ccdf6f09d'}]}, 'timestamp': '2025-12-05 12:10:35.404436', '_unique_id': '64957bc634fc4c23af8c423065ec5a49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.405 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.406 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.438 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.bytes volume: 30243328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.438 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.460 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.460 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89cb980c-7a6a-42f1-a761-af3ae5620973', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30243328, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6791a4ca-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '1548c642611bf3a110a517f1c4620ed536faf79b0037d1cef3280d66cc7b7540'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6791b014-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '6c3a2584f372d998296bc51de8e3c3d697d8270b2b5a161d6d46afb78cd9c75b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6794edc4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '85c2134e0fbc8f4190314b2c10c1963a4d26856e5188d7b7b7f031821e40633b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.406620', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6794f918-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '72346a86546f45a4511e91edbebc5c5ab1304795f234947cd763b360f6a37715'}]}, 'timestamp': '2025-12-05 12:10:35.460809', '_unique_id': '46e9b1a28a894a94b278da1b115fcb42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.461 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.462 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19385ae9-28cc-4eb2-a759-dbb67da93c3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679546b6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'be9e32dbf7949037d8dc30c76b38d7cb6d5fc77985e038b51c7b39a476f888af'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67954e2c-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': '61ecafe6d24183d5f8ed9117a2fb913cdd082295a0e82555176081ad400f77eb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679555de-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'fd7af1c35f018dfbf1f7eb37edb2f67dd040a763a0fcd50e746af9beb0c42587'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.462500', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67955eee-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'f8b684081f26e701eda1bd49df3d4cb38e80e6ab8924b5483625ab60c7c0d877'}]}, 'timestamp': '2025-12-05 12:10:35.463335', '_unique_id': '5572a8dc9b944e7ab2451e5317d6fd98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.463 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.464 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b733c48-214c-4cf4-9b64-5a3f1a34f76d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.464567', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679598d2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'b47f5c010f94161ff023667c0d917074483eaa95a12212e3b485b009aa382b95'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.464567', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '6795a200-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '06bf847c07670f4b1ce8230474f7ca084a4f143fceec78728831335a07b8adb2'}]}, 'timestamp': '2025-12-05 12:10:35.465123', '_unique_id': '54945c73d6ae4c32afce2c943377129f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.465 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.466 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1bd2055-1ac1-4184-a690-84b44dc434fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.466530', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '6795e684-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'a4b9a752734e9f5fca0ff6978a17587e6cc3f66a11248f41678a4447d55367e4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.466530', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '6795f174-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'b1cbc7083e0fe37d2c1dc0714dbddecd658d8c1ea91a7e73f6939ca044bda5ed'}]}, 'timestamp': '2025-12-05 12:10:35.467154', '_unique_id': '365223e7c35a475d97cfd9bfcdcb777e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.467 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.468 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70735f2b-d94e-4bcc-920e-dfd3f605e54d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.468446', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67962ff4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '655fad305fdc328893e77ba38e26605a914f67d61236c5df9085e510cdf5b5c0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.468446', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67963ba2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '29e3b4b9e13b8f6cfe10ba4af693a98134c4db006a7dad73d4676b10420abc16'}]}, 'timestamp': '2025-12-05 12:10:35.469048', '_unique_id': '0489860b3dac46fa92f16fd886142e4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.469 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.470 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '065b6943-08e3-4236-90f7-196059909052', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.470190', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679672d4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '98a569d4056e8509b28a9fff09c6b4fdb7ffc7d4d5d553743b7a5bd3e0399205'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.470190', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67967aea-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'e3864d80c548b559e93ec593a94b7119e50f8c46b3c2cf086f0e770fed6d1246'}]}, 'timestamp': '2025-12-05 12:10:35.470611', '_unique_id': '35a0f617174d42e99813eab47b3ddbba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.latency volume: 224419055 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.471 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.latency volume: 21240656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.472 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.472 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e41f37ec-9d9b-46e6-a365-e8f9257b6049', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 224419055, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6796ad3a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '561a5afdf998a9d58da1e16e7549b69b61c9b2791165efba118404c64ce4c796'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21240656, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6796b6f4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '87a135ce881ef8e97ec1f837386a93521244b01cfed4b6c4540ccc937337ab4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 
'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6796c05e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '0e00fe7aa7e64e9b10bf3d6be194b14553e560ca13d5467f66cc3a1c44ab612f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.471688', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6796ca86-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'b6c7a8a36666584ee721f44943a1e40dadf820cc9176b798c05680741a09c386'}]}, 'timestamp': '2025-12-05 12:10:35.472695', '_unique_id': 'eb5f27ca792a467cb2f116416e6d4a1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.473 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4af9242-7dcb-43a4-bacc-a2b3db599161', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.473890', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67970352-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'd9b0e44316c5e364c03a88ff3ec612766c39a9d82c4867f618268ce28dc10abf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.473890', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67970c44-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '4f4bc838ad0a474169876973fe805ab0863898d46914bd59f136eead41eb320f'}]}, 'timestamp': '2025-12-05 12:10:35.474331', '_unique_id': 'e2e953b41e3c4a7799c12cc766041bb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.474 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.475 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.bytes volume: 72925184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.476 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2efa5b8c-04b1-49af-afe7-624ff3e2509a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72925184, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67975366-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'f7d4997888e3d61792f37c1c9043f348130d02e84482b8cdfbbc2f72c8bc05e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67975b54-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '029b474f604d4f1a4e0aa5d140f52caa7fae878370a8f87ddc494cbf68217f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67976464-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'f1535f0beb9291c2f018b3de001797c47ed22e33e3aed6d3b1c656f1f7b1b39a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.475915', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67976f0e-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'cb7f6f1dad6058a80e48f2c678e2154c3a6fe46676f1b90a60f985379c73495f'}]}, 'timestamp': '2025-12-05 12:10:35.476891', '_unique_id': '974eacf9ed744d8596737b8aa4dcdd7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.477 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.478 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.498 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/cpu volume: 10980000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.508 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.514 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccbb3d4f-1114-41e7-b69f-1360ff3f219a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10980000000, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'timestamp': '2025-12-05T12:10:35.478470', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '679abc72-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.118920524, 'message_signature': 'fd3ae8049a713f24cd697cbe15e523ecafa0164ab5c37b1f193b5ff3449d7f21'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 
'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'timestamp': '2025-12-05T12:10:35.478470', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '679d3088-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.135104737, 'message_signature': 'd8226b0875f21f979e25381e405f9fc6e3faa3a56e000c91173235ad6389fc97'}]}, 'timestamp': '2025-12-05 12:10:35.514716', '_unique_id': '84f847cc3e8c445592e69aa1c7bf6e2e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.515 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.516 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.516 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.517 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cf501e4-ee6b-4a98-ba54-25f835567591', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679da072-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'ea289b470b837ddbe402e829ce15f310cd0b73479117aeae2fe5fa398874d205'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679da978-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '9486a8555f71c8273a705372ebde2c4c9c37cb0f9b9e4e829c2cbfffa2223be4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679db198-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '8f1767f54c8b6efa7060896ed737877200cc4571129bf927d59b45617b1139f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.516563', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679dbbe8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'f959a2b136ee341152d1dd4ebbecd44daefe9226b272d39a73e6c0c93f837eaa'}]}, 'timestamp': '2025-12-05 12:10:35.518173', '_unique_id': 'b93250d030684d56be2c064f43baf9f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.518 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.519 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.520 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.520 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38cb8820-abfb-4fa0-8229-db07525f6ede', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e0170-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': 'f732ed6ac40479f13aa9c592985070202bd7f56ac7101d8da53873c556dfbe10'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 
'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e09d6-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.000859804, 'message_signature': '589087d8cdfbd8e1bf37f6bb33614e4d69c61d1ffb099b5df04dc1171864afb6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e1156-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'a1a4c21a8bb6bef2ebe713c860037dad1f4c2e8c80e3d19d995f2eb6a0436e65'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.519678', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e1886-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.014581006, 'message_signature': 'fa88e6517d755748128f5835dd658c56941b5262b9fe8bcbe0e3ec6cdc584ec9'}]}, 'timestamp': '2025-12-05 12:10:35.520511', '_unique_id': '90d2634c16754cbfbc4d1cbf530664d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.521 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.latency volume: 4439600897 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.522 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63a798d9-14c5-4234-8337-c0355057e7d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4439600897, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e5ae4-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'cdbe72b0ad6cc04581435dfb0f110dc5c11481859326d8e3ef98503693509802'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': 
None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e634a-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': 'b4150a5fbcff8675c543ec9ebf2e58dc7e95fb51d99c8ee3671bd0993dba504a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679e6ac0-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '94aee0ec138063744306f1e465c5a09bead883cf06a8c1e5c3bd1626e3a3d932'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.521930', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679e7272-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '93c3ba61e894ce48f9b66f5e0eb24cbfb2636f0b113be071fd8c77c369c92499'}]}, 'timestamp': '2025-12-05 12:10:35.522841', '_unique_id': 'e9ea58007ec14183be4bd4969934756b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.523 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.524 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ecc25cb4-5b3a-43f7-949d-ca9a1a19056a: ceilometer.compute.pollsters.NoVolumeException
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15275e60-bc2f-4d30-a644-5829eeb85daf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'timestamp': '2025-12-05T12:10:35.524175', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '679eaff8-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.118920524, 'message_signature': 'f89a58207c17fa035917f9ab389baeaf4cf402f50ef2dbf545fb24da9b7a6fa4'}]}, 'timestamp': '2025-12-05 12:10:35.524584', '_unique_id': 'ce09d925a396434bbee1497bd2605141'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.525 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.526 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.526 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd847d934-b825-4922-b3f7-934b45d7c1a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-vda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679eeca2-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '9660dc6cccee14018ceac31b4abe8b81af231c064a310d7eae99bc00c9fcc218'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 
'project_name': None, 'resource_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84-sda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'instance-0000004a', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679ef508-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.027700662, 'message_signature': '350e8a9b918f2ff27e7d25b34ea0f4b1262c6837bf512b6ab1d688f2e3967358'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-vda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': 
'', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '679eff26-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': 'bbdec4bfc4e4a3b93d07bc7cd4a17423637c4d3aa1ee1c17c3b00c33828f21e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-sda', 'timestamp': '2025-12-05T12:10:35.525741', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'instance-0000004b', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '679f0674-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4016.060235224, 'message_signature': '16ba444e10e4b4d3879dc709f0664aed173e9f6c755d29aa005f06015b7583f6'}]}, 'timestamp': '2025-12-05 12:10:35.526603', '_unique_id': '96256ef8e52a41da9a3760a7d14d462d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.527 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47169a6a-1829-47b4-a3fc-f803c4423029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.527815', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679f3e78-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'ae4fc6c8dece74b094dc8e6d2e1722db6ce7edf82c971f5ca2fb73ef300dc6d9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.527815', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679f4788-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': '85dd77d603730043dd4f0533b29194e0f6d0f4cd1573f7e077bf2355c623c604'}]}, 'timestamp': '2025-12-05 12:10:35.528278', '_unique_id': '84dad1814f174e31a613f22972e412d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.528 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.529 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da98dae8-590a-4356-ad8a-0867a7fe42e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.529723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679f8810-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'eb0e184dfbe2a1e74f3eca3392297c20f59d9b6fcf4869ca56a079c46f92b210'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.529723', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679f9364-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'acfac9a011a0d4ca63e4adaa023f051266077597ecdebaf3f9a0c80861c001f1'}]}, 'timestamp': '2025-12-05 12:10:35.530222', '_unique_id': 'b8089e3de2e945e4b3ed412fc80737e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.530 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-569275018>, <NovaLikeServer: tempest-tempest.common.compute-instance-914539058>]
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.531 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.532 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bf67bc3-4117-410c-87b1-e374d8d8f2bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.531751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '679fd928-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': '7d839ac6bd8cdb68d7690dd4eab39bd111707ff5dad7e5c4eb6d460fb23f331b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.531751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '679ff354-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'c07c087556f745338d9782bcd1493479594bf9950eb8e7fc7a3fe92e85487e7a'}]}, 'timestamp': '2025-12-05 12:10:35.532816', '_unique_id': '163511e283e644bc8905501a88597ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.533 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 DEBUG ceilometer.compute.pollsters [-] 54d9605a-998b-4492-afc8-f7a5b0dd4e84/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.534 12 DEBUG ceilometer.compute.pollsters [-] ecc25cb4-5b3a-43f7-949d-ca9a1a19056a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd200daa7-a6df-42f1-94df-abd50a13b6c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004a-54d9605a-998b-4492-afc8-f7a5b0dd4e84-tapef99bad5-d0', 'timestamp': '2025-12-05T12:10:35.534254', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-569275018', 'name': 'tapef99bad5-d0', 'instance_id': '54d9605a-998b-4492-afc8-f7a5b0dd4e84', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:bd:e5:94', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapef99bad5-d0'}, 'message_id': '67a039cc-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.988994734, 'message_signature': 'a8e3f61231709bfb46528ad40286317b0b47d4ec4eb23d7bba65e8d5b88c4594'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'242b773b0af24caf814e2a84178332d5', 'user_name': None, 'project_id': '98681240c47b41cba28d91e1c11fd71f', 'project_name': None, 'resource_id': 'instance-0000004b-ecc25cb4-5b3a-43f7-949d-ca9a1a19056a-tap88e41011-3e', 'timestamp': '2025-12-05T12:10:35.534254', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-914539058', 'name': 'tap88e41011-3e', 'instance_id': 'ecc25cb4-5b3a-43f7-949d-ca9a1a19056a', 'instance_type': 'm1.nano', 'host': 'cf5bfe44c66901ec6eddc3740ac4aad1555c04023f5c3bdec8576322', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'image_ref': 'a6987852-063f-405d-a848-6b382694811e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a2:40:d1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap88e41011-3e'}, 'message_id': '67a041ce-d1d3-11f0-8572-fa163e006c52', 'monotonic_time': 4015.99443237, 'message_signature': 'c95507eddb6a441ce33a8534b202cf83d27be6895fda6f4050e9e6b684716950'}]}, 'timestamp': '2025-12-05 12:10:35.534724', '_unique_id': '61f4357a4af44414aa4473a28ce9d620'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     yield
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec  5 07:10:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:10:35.535 12 ERROR oslo_messaging.notify.messaging 
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.689 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Successfully updated port: 10dc6775-d9c9-40ca-bd05-41c56cffc744 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquiring lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG oslo_concurrency.lockutils [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] Acquired lock "refresh_cache-e689e2f0-16e9-402a-986e-a769d72fa0bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.704 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:10:35 np0005546909 nova_compute[187208]: 2025-12-05 12:10:35.799 187212 DEBUG nova.network.neutron [None req-a0c5e45a-2d42-463a-96d1-0ac8a1d7dbe0 62153b585ecc4e6fa2ad567851d49081 0c982a61e3fc4c8da9248076bb0361ac - - default default] [instance: 30cb83d4-3a34-4420-bc83-099b266da48c] Successfully created port: 96dab709-f4e0-48a6-ab76-0b13fdf97017 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.025 187212 DEBUG nova.network.neutron [None req-031c8ab1-c6ca-48a2-bbfa-3cb8cafae462 8b8b32a7fde5424795b54914a14028b5 7e3f3e747de24befad6008f67eb551ae - - default default] [instance: e689e2f0-16e9-402a-986e-a769d72fa0bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:10:36 np0005546909 ovn_controller[95610]: 2025-12-05T12:10:36Z|00731|binding|INFO|Releasing lport c2b03c34-62ec-4644-b043-43f2baa5f384 from this chassis (sb_readonly=0)
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.453 187212 DEBUG nova.network.neutron [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Updating instance_info_cache with network_info: [{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.816 187212 DEBUG oslo_concurrency.lockutils [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Releasing lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG nova.compute.manager [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Instance network_info: |[{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG oslo_concurrency.lockutils [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-8fe1c6df-f787-4c56-b3e7-899cf5e9f723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.817 187212 DEBUG nova.network.neutron [req-e0efea3b-8f8b-45a1-8477-0caca9b01d31 req-b89d1a34-681d-4bde-bcc9-6897bc3fff29 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Refreshing network info cache for port 8c343187-712d-4aee-9c47-18497ec1042e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.820 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] Start _get_guest_xml network_info=[{"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.826 187212 WARNING nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.832 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.833 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.837 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.libvirt.host [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.838 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.839 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.840 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.841 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.841 187212 DEBUG nova.virt.hardware [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.845 187212 DEBUG nova.virt.libvirt.vif [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:10:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-2029374639',display_name='tempest-ServerDiskConfigTestJSON-server-2029374639',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-2029374639',id=77,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e836357870d746e49bc783da7cd3accd',ramdisk_id='',reservation_id='r-ep0a320q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1245488084',owner_user_name='tempest-ServerD
iskConfigTestJSON-1245488084-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:10:25Z,user_data=None,user_id='ef254bb2df0442c6bcadfb3a6861c0e9',uuid=8fe1c6df-f787-4c56-b3e7-899cf5e9f723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.846 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converting VIF {"id": "8c343187-712d-4aee-9c47-18497ec1042e", "address": "fa:16:3e:56:54:21", "network": {"id": "7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1513854546-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e836357870d746e49bc783da7cd3accd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c343187-71", "ovs_interfaceid": "8c343187-712d-4aee-9c47-18497ec1042e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.847 187212 DEBUG nova.network.os_vif_util [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:54:21,bridge_name='br-int',has_traffic_filtering=True,id=8c343187-712d-4aee-9c47-18497ec1042e,network=Network(7be4540a-0e0d-45dd-9ed3-2c2701ae3e2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c343187-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.847 187212 DEBUG nova.objects.instance [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] Lazy-loading 'pci_devices' on Instance uuid 8fe1c6df-f787-4c56-b3e7-899cf5e9f723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:10:36 np0005546909 nova_compute[187208]: 2025-12-05 12:10:36.865 187212 DEBUG nova.virt.libvirt.driver [None req-acf14d99-c584-4456-98c7-95b1f335e1cf ef254bb2df0442c6bcadfb3a6861c0e9 e836357870d746e49bc783da7cd3accd - - default default] [instance: 8fe1c6df-f787-4c56-b3e7-899cf5e9f723] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <uuid>8fe1c6df-f787-4c56-b3e7-899cf5e9f723</uuid>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <name>instance-0000004d</name>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-2029374639</nova:name>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:10:36</nova:creationTime>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:user uuid="ef254bb2df0442c6bcadfb3a6861c0e9">tempest-ServerDiskConfigTestJSON-1245488084-project-member</nova:user>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:project uuid="e836357870d746e49bc783da7cd3accd">tempest-ServerDiskConfigTestJSON-1245488084</nova:project>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        <nova:port uuid="8c343187-712d-4aee-9c47-18497ec1042e">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="serial">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="uuid">8fe1c6df-f787-4c56-b3e7-899cf5e9f723</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/8fe1c6df-f787-4c56-b3e7-899cf5e9f723/disk"/>
Dec  5 07:10:36 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
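The `_get_guest_xml` dump above (truncated in this capture) embeds Nova's instance metadata under the `http://openstack.org/xmlns/libvirt/nova/1.1` namespace. As a minimal sketch, the stdlib `xml.etree.ElementTree` can pull those fields back out; the fragment below is a hand-reduced, well-formed copy of the logged `<metadata>` block, and the `nova_metadata` helper is illustrative, not part of Nova.

```python
# Sketch: extracting Nova's instance metadata from a libvirt domain XML
# like the one logged by _get_guest_xml above. The XML fragment is a
# hand-reduced, well-formed copy of the logged block; values come from
# the log, the helper itself is hypothetical.
import xml.etree.ElementTree as ET

NOVA_NS = "http://openstack.org/xmlns/libvirt/nova/1.1"

XML = """<domain type="kvm">
 <uuid>8fe1c6df-f787-4c56-b3e7-899cf5e9f723</uuid>
 <metadata>
   <nova:instance xmlns:nova="{ns}">
     <nova:name>tempest-ServerDiskConfigTestJSON-server-2029374639</nova:name>
     <nova:flavor name="m1.nano">
       <nova:memory>128</nova:memory>
       <nova:vcpus>1</nova:vcpus>
     </nova:flavor>
   </nova:instance>
 </metadata>
</domain>""".format(ns=NOVA_NS)

def nova_metadata(domain_xml: str) -> dict:
    """Return a few Nova metadata fields from a libvirt domain XML string."""
    root = ET.fromstring(domain_xml)
    ns = {"nova": NOVA_NS}
    inst = root.find("./metadata/nova:instance", ns)
    flavor = inst.find("nova:flavor", ns)
    return {
        "uuid": root.findtext("uuid"),
        "display_name": inst.findtext("nova:name", namespaces=ns),
        "flavor": flavor.get("name"),
        "memory_mb": int(flavor.findtext("nova:memory", namespaces=ns)),
    }

meta = nova_metadata(XML)
print(meta)
```

This is handy when correlating a libvirt domain (e.g. `virsh dumpxml instance-0000004d`) back to the Nova display name and flavor without querying the API.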
Dec  5 07:15:03 np0005546909 nova_compute[187208]: 2025-12-05 12:15:03.713 187212 DEBUG nova.network.neutron [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:03 np0005546909 nova_compute[187208]: 2025-12-05 12:15:03.741 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Releasing lock "refresh_cache-108114b5-8832-494c-b436-40ffa2ffb7c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:15:03 np0005546909 nova_compute[187208]: 2025-12-05 12:15:03.742 187212 DEBUG nova.compute.manager [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:15:03 np0005546909 nova_compute[187208]: 2025-12-05 12:15:03.760 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:03 np0005546909 systemd[1]: machine-qemu\x2d118\x2dinstance\x2d00000061.scope: Deactivated successfully.
Dec  5 07:15:03 np0005546909 systemd[1]: machine-qemu\x2d118\x2dinstance\x2d00000061.scope: Consumed 5.648s CPU time.
Dec  5 07:15:03 np0005546909 systemd-machined[153543]: Machine qemu-118-instance-00000061 terminated.
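The scope name `machine-qemu\x2d118\x2dinstance\x2d00000061.scope` above is the systemd-escaped form of the machine `qemu-118-instance-00000061` that systemd-machined reports one line later: systemd escapes `-` inside unit names as `\x2d`. A minimal, illustrative decoder (the `unescape_unit` helper is hypothetical, not a systemd API):

```python
# Sketch: undo systemd's \xNN escaping in unit names, as seen in
# machine-qemu\x2d118\x2dinstance\x2d00000061.scope in the log above.
# The helper is illustrative only.
import re

def unescape_unit(name: str) -> str:
    # Replace each \xNN escape with the character it encodes ("-" is \x2d).
    return re.sub(r"\\x([0-9a-fA-F]{2})",
                  lambda m: chr(int(m.group(1), 16)), name)

scope = r"machine-qemu\x2d118\x2dinstance\x2d00000061.scope"
machine = unescape_unit(scope)
# Strip the "machine-" prefix and ".scope" suffix to get the machine name.
machine = machine[len("machine-"):-len(".scope")]
print(machine)  # qemu-118-instance-00000061
```

For ad-hoc inspection, `systemd-escape --unescape` performs the same transformation from the shell.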
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.004 187212 INFO nova.virt.libvirt.driver [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance destroyed successfully.#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.005 187212 DEBUG nova.objects.instance [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lazy-loading 'resources' on Instance uuid 108114b5-8832-494c-b436-40ffa2ffb7c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:04 np0005546909 rsyslogd[1004]: imjournal: 17233 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.018 187212 INFO nova.virt.libvirt.driver [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deleting instance files /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.019 187212 INFO nova.virt.libvirt.driver [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deletion of /var/lib/nova/instances/108114b5-8832-494c-b436-40ffa2ffb7c1_del complete#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.084 187212 INFO nova.compute.manager [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG oslo.service.loopingcall [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.085 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.223 187212 DEBUG nova.network.neutron [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.273 187212 INFO nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Took 1.19 seconds to deallocate network for instance.#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.341 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.342 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.436 187212 DEBUG nova.compute.provider_tree [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.473 187212 DEBUG nova.scheduler.client.report [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
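The inventory dict logged above maps directly to schedulable capacity: for each resource class, Placement treats capacity as `(total - reserved) * allocation_ratio`. A short sketch with the values copied from the log (the computation is illustrative, not Nova code):

```python
# Sketch: Placement-style capacity from the inventory nova logged above.
# Schedulable capacity per resource class is
# (total - reserved) * allocation_ratio; values are copied from the log.
inventory = {
    'VCPU': {'total': 8, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 79, 'reserved': 1, 'allocation_ratio': 0.9},
}

capacity = {
    rc: int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    for rc, inv in inventory.items()
}
print(capacity)  # {'VCPU': 32, 'MEMORY_MB': 7167, 'DISK_GB': 70}
```

So this 8-core, ~7.5 GiB host advertises 32 vCPUs (4.0 overcommit), 7167 MB of RAM, and 70 GB of disk (0.9 ratio reserves headroom) to the scheduler.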
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.505 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.509 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.530 187212 DEBUG nova.network.neutron [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.550 187212 INFO nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Took 0.46 seconds to deallocate network for instance.#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.560 187212 INFO nova.scheduler.client.report [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Deleted allocations for instance 28e48516-8665-4d98-a92d-c84b7da9a284#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.651 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.652 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.708 187212 DEBUG oslo_concurrency.lockutils [None req-821bcaea-f709-42df-8a76-855036fafbec e90fa3a379b4494c84626bb6a761cd30 c5b34686513f4abc8165113eb8c6831e - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.739 187212 DEBUG nova.compute.provider_tree [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.756 187212 DEBUG nova.scheduler.client.report [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.779 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.819 187212 INFO nova.scheduler.client.report [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Deleted allocations for instance 108114b5-8832-494c-b436-40ffa2ffb7c1#033[00m
Dec  5 07:15:04 np0005546909 nova_compute[187208]: 2025-12-05 12:15:04.877 187212 DEBUG oslo_concurrency.lockutils [None req-03021624-241e-4889-b0db-5b5ea424cdae 43dfbe2f6638492887b1176c979cc641 96143bdab6004f13b4ae4ed16efdbf16 - - default default] Lock "108114b5-8832-494c-b436-40ffa2ffb7c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.313 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.313 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG oslo_concurrency.lockutils [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "28e48516-8665-4d98-a92d-c84b7da9a284-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] No waiting events found dispatching network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 WARNING nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received unexpected event network-vif-plugged-e30774db-d3d3-4438-b68a-6f7855f55128 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:15:05 np0005546909 nova_compute[187208]: 2025-12-05 12:15:05.314 187212 DEBUG nova.compute.manager [req-4388c7f3-33dd-450d-84ff-7969e6382819 req-4ad49218-2ba3-499d-934f-7c38c3de6a9a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Received event network-vif-deleted-e30774db-d3d3-4438-b68a-6f7855f55128 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:07 np0005546909 podman[239411]: 2025-12-05 12:15:07.22276866 +0000 UTC m=+0.069623794 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  5 07:15:07 np0005546909 podman[239412]: 2025-12-05 12:15:07.241054098 +0000 UTC m=+0.085462771 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:15:07 np0005546909 nova_compute[187208]: 2025-12-05 12:15:07.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:08 np0005546909 nova_compute[187208]: 2025-12-05 12:15:08.763 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:10 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:10Z|01049|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec  5 07:15:10 np0005546909 nova_compute[187208]: 2025-12-05 12:15:10.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:12 np0005546909 nova_compute[187208]: 2025-12-05 12:15:12.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.363 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.364 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.387 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.478 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.479 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.486 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.487 187212 INFO nova.compute.claims [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.642 187212 DEBUG nova.compute.provider_tree [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.765 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.801 187212 DEBUG nova.scheduler.client.report [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.823 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.824 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.865 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.884 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:15:13 np0005546909 nova_compute[187208]: 2025-12-05 12:15:13.901 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.007 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.009 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.009 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating image(s)#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.010 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.010 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.011 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.025 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.095 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.097 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.097 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.109 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.166 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.167 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.210 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.211 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.213 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.283 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.284 187212 DEBUG nova.virt.disk.api [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Checking if we can resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.285 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.352 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.354 187212 DEBUG nova.virt.disk.api [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Cannot resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.354 187212 DEBUG nova.objects.instance [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'migration_context' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.418 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.419 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ensure instance console log exists: /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.419 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.420 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.420 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.422 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.426 187212 WARNING nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.433 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.435 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.439 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.440 187212 DEBUG nova.virt.libvirt.host [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.444 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.444 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.445 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.446 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.447 187212 DEBUG nova.virt.hardware [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.453 187212 DEBUG nova.objects.instance [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.488 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <uuid>f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</uuid>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <name>instance-00000062</name>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerShowV257Test-server-228959241</nova:name>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:15:14</nova:creationTime>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:user uuid="3733ad965c154ae490947ad2a50e221d">tempest-ServerShowV257Test-1797821111-project-member</nova:user>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:        <nova:project uuid="239937ac98c24d5198788674713b75a1">tempest-ServerShowV257Test-1797821111</nova:project>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="serial">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="uuid">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log" append="off"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:15:14 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:15:14 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:15:14 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:15:14 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.564 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.564 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:14 np0005546909 nova_compute[187208]: 2025-12-05 12:15:14.565 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Using config drive#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.192 187212 INFO nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating config drive at /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.197 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ekbyhys execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.330 187212 DEBUG oslo_concurrency.processutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5ekbyhys" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:15 np0005546909 systemd-machined[153543]: New machine qemu-119-instance-00000062.
Dec  5 07:15:15 np0005546909 systemd[1]: Started Virtual Machine qemu-119-instance-00000062.
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.876 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936915.8756666, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.876 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.879 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.879 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.882 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance spawned successfully.#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.882 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.898 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.904 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.908 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.908 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.909 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.909 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.910 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.910 187212 DEBUG nova.virt.libvirt.driver [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936915.8788347, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.937 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.965 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.968 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.981 187212 INFO nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 1.97 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:15:15 np0005546909 nova_compute[187208]: 2025-12-05 12:15:15.982 187212 DEBUG nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:16 np0005546909 nova_compute[187208]: 2025-12-05 12:15:16.008 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:15:16 np0005546909 nova_compute[187208]: 2025-12-05 12:15:16.043 187212 INFO nova.compute.manager [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 2.61 seconds to build instance.#033[00m
Dec  5 07:15:16 np0005546909 nova_compute[187208]: 2025-12-05 12:15:16.064 187212 DEBUG oslo_concurrency.lockutils [None req-187e28ad-be74-43b1-816f-e2e6ea7ec606 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:17 np0005546909 podman[239493]: 2025-12-05 12:15:17.20796282 +0000 UTC m=+0.055230228 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:15:17 np0005546909 podman[239492]: 2025-12-05 12:15:17.209581857 +0000 UTC m=+0.060814710 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:15:17 np0005546909 podman[239494]: 2025-12-05 12:15:17.246395771 +0000 UTC m=+0.090154838 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:15:17 np0005546909 nova_compute[187208]: 2025-12-05 12:15:17.944 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936902.9433396, 28e48516-8665-4d98-a92d-c84b7da9a284 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:17 np0005546909 nova_compute[187208]: 2025-12-05 12:15:17.944 187212 INFO nova.compute.manager [-] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:15:17 np0005546909 nova_compute[187208]: 2025-12-05 12:15:17.978 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:18 np0005546909 nova_compute[187208]: 2025-12-05 12:15:18.107 187212 DEBUG nova.compute.manager [None req-c8491b5f-0679-46e8-8e6d-66a723658f43 - - - - - -] [instance: 28e48516-8665-4d98-a92d-c84b7da9a284] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:18 np0005546909 nova_compute[187208]: 2025-12-05 12:15:18.767 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:19 np0005546909 nova_compute[187208]: 2025-12-05 12:15:19.002 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936904.0013607, 108114b5-8832-494c-b436-40ffa2ffb7c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:19 np0005546909 nova_compute[187208]: 2025-12-05 12:15:19.002 187212 INFO nova.compute.manager [-] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:15:19 np0005546909 nova_compute[187208]: 2025-12-05 12:15:19.381 187212 DEBUG nova.compute.manager [None req-43d29690-47e5-4748-93bb-24bf9571babf - - - - - -] [instance: 108114b5-8832-494c-b436-40ffa2ffb7c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.061 187212 INFO nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Rebuilding instance#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.778 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.792 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.890 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_requests' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.904 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'pci_devices' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.926 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'resources' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.941 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'migration_context' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.964 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:15:20 np0005546909 nova_compute[187208]: 2025-12-05 12:15:20.968 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:15:22 np0005546909 nova_compute[187208]: 2025-12-05 12:15:22.989 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:23 np0005546909 nova_compute[187208]: 2025-12-05 12:15:23.770 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.359 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.360 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.391 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.464 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.465 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.473 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.473 187212 INFO nova.compute.claims [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.629 187212 DEBUG nova.compute.provider_tree [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.651 187212 DEBUG nova.scheduler.client.report [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.678 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.679 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.750 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.751 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.787 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.804 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.895 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.897 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.902 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating image(s)#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.903 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.903 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.904 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.916 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.983 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.985 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:25 np0005546909 nova_compute[187208]: 2025-12-05 12:15:25.986 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.000 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.076 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.078 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.131 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.133 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.133 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.196 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.199 187212 DEBUG nova.virt.disk.api [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.200 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.270 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.272 187212 DEBUG nova.virt.disk.api [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.272 187212 DEBUG nova.objects.instance [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.286 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.288 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ensure instance console log exists: /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.288 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.289 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.289 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:26 np0005546909 nova_compute[187208]: 2025-12-05 12:15:26.394 187212 DEBUG nova.policy [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '75752a4cc8f7487e8dc4440201f894c8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:15:27 np0005546909 nova_compute[187208]: 2025-12-05 12:15:27.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:28 np0005546909 nova_compute[187208]: 2025-12-05 12:15:28.162 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Successfully created port: b5a9a5df-a95c-46bb-b043-0ff6ae79599e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:15:28 np0005546909 podman[239582]: 2025-12-05 12:15:28.195251592 +0000 UTC m=+0.048692249 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:15:28 np0005546909 nova_compute[187208]: 2025-12-05 12:15:28.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:30 np0005546909 nova_compute[187208]: 2025-12-05 12:15:30.764 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Successfully updated port: b5a9a5df-a95c-46bb-b043-0ff6ae79599e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:15:30 np0005546909 nova_compute[187208]: 2025-12-05 12:15:30.788 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:15:30 np0005546909 nova_compute[187208]: 2025-12-05 12:15:30.789 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:15:30 np0005546909 nova_compute[187208]: 2025-12-05 12:15:30.789 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:15:31 np0005546909 nova_compute[187208]: 2025-12-05 12:15:31.051 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:15:31 np0005546909 nova_compute[187208]: 2025-12-05 12:15:31.122 187212 DEBUG nova.compute.manager [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-changed-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:31 np0005546909 nova_compute[187208]: 2025-12-05 12:15:31.123 187212 DEBUG nova.compute.manager [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Refreshing instance network info cache due to event network-changed-b5a9a5df-a95c-46bb-b043-0ff6ae79599e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:15:31 np0005546909 nova_compute[187208]: 2025-12-05 12:15:31.123 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:15:31 np0005546909 nova_compute[187208]: 2025-12-05 12:15:31.236 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.916 187212 DEBUG nova.network.neutron [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.938 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.938 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance network_info: |[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.939 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.939 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Refreshing network info cache for port b5a9a5df-a95c-46bb-b043-0ff6ae79599e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.942 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start _get_guest_xml network_info=[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.946 187212 WARNING nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.950 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.952 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.957 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.958 187212 DEBUG nova.virt.libvirt.host [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.958 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.959 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.959 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.960 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.961 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.962 187212 DEBUG nova.virt.hardware [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.968 187212 DEBUG nova.virt.libvirt.vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-tempest.common.compute-instance-58863967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTes
tJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:25Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.969 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.970 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.972 187212 DEBUG nova.objects.instance [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.989 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <uuid>b9ba9fad-eaef-4c3b-9793-23053fe1ace1</uuid>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <name>instance-00000063</name>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:name>tempest-tempest.common.compute-instance-58863967</nova:name>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:15:32</nova:creationTime>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        <nova:port uuid="b5a9a5df-a95c-46bb-b043-0ff6ae79599e">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="serial">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="uuid">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:6b:1e:ff"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <target dev="tapb5a9a5df-a9"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log" append="off"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:15:32 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:15:32 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:15:32 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:15:32 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.990 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Preparing to wait for external event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.991 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.992 187212 DEBUG nova.virt.libvirt.vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-tempest.common.compute-instance-58863967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-Server
ActionsTestJSON-1748869140-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:25Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.993 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.993 187212 DEBUG nova.network.os_vif_util [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.994 187212 DEBUG os_vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.995 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.995 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:32 np0005546909 nova_compute[187208]: 2025-12-05 12:15:32.997 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5a9a5df-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.003 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5a9a5df-a9, col_values=(('external_ids', {'iface-id': 'b5a9a5df-a95c-46bb-b043-0ff6ae79599e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:1e:ff', 'vm-uuid': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 NetworkManager[55691]: <info>  [1764936933.0059] manager: (tapb5a9a5df-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.012 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.014 187212 INFO os_vif [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.078 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.079 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.079 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No VIF found with MAC fa:16:3e:6b:1e:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.080 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Using config drive#033[00m
Dec  5 07:15:33 np0005546909 podman[239614]: 2025-12-05 12:15:33.139394976 +0000 UTC m=+0.087705477 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:15:33 np0005546909 systemd[1]: machine-qemu\x2d119\x2dinstance\x2d00000062.scope: Deactivated successfully.
Dec  5 07:15:33 np0005546909 systemd[1]: machine-qemu\x2d119\x2dinstance\x2d00000062.scope: Consumed 12.629s CPU time.
Dec  5 07:15:33 np0005546909 systemd-machined[153543]: Machine qemu-119-instance-00000062 terminated.
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.684 187212 INFO nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating config drive at /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.690 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfl1mp6a_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.774 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.835 187212 DEBUG oslo_concurrency.processutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfl1mp6a_" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:33 np0005546909 kernel: tapb5a9a5df-a9: entered promiscuous mode
Dec  5 07:15:33 np0005546909 NetworkManager[55691]: <info>  [1764936933.9316] manager: (tapb5a9a5df-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/402)
Dec  5 07:15:33 np0005546909 systemd-udevd[239634]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:15:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:33Z|01050|binding|INFO|Claiming lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e for this chassis.
Dec  5 07:15:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:33Z|01051|binding|INFO|b5a9a5df-a95c-46bb-b043-0ff6ae79599e: Claiming fa:16:3e:6b:1e:ff 10.100.0.8
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.932 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.941 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:15:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.944 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis#033[00m
Dec  5 07:15:33 np0005546909 NetworkManager[55691]: <info>  [1764936933.9468] device (tapb5a9a5df-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:15:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.946 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:15:33 np0005546909 NetworkManager[55691]: <info>  [1764936933.9480] device (tapb5a9a5df-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:15:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:33Z|01052|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e ovn-installed in OVS
Dec  5 07:15:33 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:33Z|01053|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e up in Southbound
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.948 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.950 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 nova_compute[187208]: 2025-12-05 12:15:33.954 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:33 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:33.971 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceb78b0-d2b2-4059-902e-275b704344d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:33 np0005546909 systemd-machined[153543]: New machine qemu-120-instance-00000063.
Dec  5 07:15:34 np0005546909 systemd[1]: Started Virtual Machine qemu-120-instance-00000063.
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.012 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fd422812-7b70-497d-8907-c3d1ce3ac436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.018 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c962baa6-8cf4-425a-85be-5bfd7b9ab9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.052 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6c5909f0-4637-4f56-b8a9-b37ac9cfc6c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.070 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.077 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.079 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b274f1-c12d-4221-bd3b-cff342916a79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239671, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.085 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.087 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deleting instance files /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.088 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deletion of /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del complete#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.098 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[75cde256-330c-4d1e-b277-35d8c18d7406]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239674, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239674, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.100 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.102 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.103 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:34 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:34.105 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.297 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.298 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating image(s)#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.299 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.299 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.300 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.319 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.383 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.384 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.385 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.396 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.456 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.457 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.568 187212 DEBUG nova.compute.manager [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.569 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG oslo_concurrency.lockutils [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.571 187212 DEBUG nova.compute.manager [req-0a61a32e-4197-462c-b054-1e391383beaa req-97f1d99d-01a6-453e-b2a2-13add504bcd2 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Processing event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.584 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.5837123, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.585 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Started (Lifecycle Event)#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.587 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.592 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.596 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance spawned successfully.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.597 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.643 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.651 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.652 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.653 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.654 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.654 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.655 187212 DEBUG nova.virt.libvirt.driver [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.661 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.710 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.711 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.5847793, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.711 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.739 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.742 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936934.591753, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.742 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.766 187212 INFO nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 8.87 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.766 187212 DEBUG nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.768 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.778 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.811 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.863 187212 INFO nova.compute.manager [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 9.42 seconds to build instance.#033[00m
Dec  5 07:15:34 np0005546909 nova_compute[187208]: 2025-12-05 12:15:34.884 187212 DEBUG oslo_concurrency.lockutils [None req-3f4881c6-e5c1-4b93-b20c-f0ecb44e160d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.097 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk 1073741824" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.098 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.099 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.161 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.163 187212 DEBUG nova.virt.disk.api [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Checking if we can resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.163 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.236 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.238 187212 DEBUG nova.virt.disk.api [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Cannot resize image /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.239 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.240 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Ensure instance console log exists: /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.242 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.243 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.243 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.246 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.253 187212 WARNING nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.262 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.264 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.267 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.268 187212 DEBUG nova.virt.libvirt.host [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.269 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.269 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.270 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.270 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.271 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.271 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.272 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.274 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.275 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.275 187212 DEBUG nova.virt.hardware [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.276 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.313 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <uuid>f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</uuid>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <name>instance-00000062</name>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerShowV257Test-server-228959241</nova:name>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:15:35</nova:creationTime>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:user uuid="3733ad965c154ae490947ad2a50e221d">tempest-ServerShowV257Test-1797821111-project-member</nova:user>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:        <nova:project uuid="239937ac98c24d5198788674713b75a1">tempest-ServerShowV257Test-1797821111</nova:project>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="serial">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="uuid">f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/console.log" append="off"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:15:35 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:15:35 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:15:35 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:15:35 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.382 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.383 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.384 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Using config drive#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.406 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.502 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'keypairs' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.977 187212 INFO nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Creating config drive at /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config#033[00m
Dec  5 07:15:35 np0005546909 nova_compute[187208]: 2025-12-05 12:15:35.984 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvxmdz1zq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.038 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updated VIF entry in instance network info cache for port b5a9a5df-a95c-46bb-b043-0ff6ae79599e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.039 187212 DEBUG nova.network.neutron [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.115 187212 DEBUG oslo_concurrency.processutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvxmdz1zq" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.167 187212 DEBUG oslo_concurrency.lockutils [req-2f934a85-856a-4556-84e5-0a82e634bfc7 req-99a3e1f1-7fb4-41db-bed5-0b6070c2f851 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b9ba9fad-eaef-4c3b-9793-23053fe1ace1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:15:36 np0005546909 systemd-machined[153543]: New machine qemu-121-instance-00000062.
Dec  5 07:15:36 np0005546909 systemd[1]: Started Virtual Machine qemu-121-instance-00000062.
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.793 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.793 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936936.7925968, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.796 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.796 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.800 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance spawned successfully.#033[00m
Dec  5 07:15:36 np0005546909 nova_compute[187208]: 2025-12-05 12:15:36.800 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.032 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.037 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.038 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.039 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.039 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.040 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.040 187212 DEBUG nova.virt.libvirt.driver [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.045 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.087 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.088 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936936.793265, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.088 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.111 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.114 187212 DEBUG nova.compute.manager [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.117 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.149 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.180 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.181 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.181 187212 DEBUG nova.objects.instance [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.309 187212 DEBUG oslo_concurrency.lockutils [None req-1cf69014-5da6-4114-b75b-59617e25fc0c 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.373 187212 DEBUG nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.373 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG oslo_concurrency.lockutils [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.374 187212 DEBUG nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:15:37 np0005546909 nova_compute[187208]: 2025-12-05 12:15:37.375 187212 WARNING nova.compute.manager [req-43bcf8c4-ec70-4cd6-b386-5826d2ce44cb req-24fe14d8-0388-4e73-a3a7-2e9e0eec0c8d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:15:38 np0005546909 nova_compute[187208]: 2025-12-05 12:15:38.007 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:38 np0005546909 podman[239726]: 2025-12-05 12:15:38.235143793 +0000 UTC m=+0.073252879 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc.)
Dec  5 07:15:38 np0005546909 podman[239727]: 2025-12-05 12:15:38.255075439 +0000 UTC m=+0.093730091 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Dec  5 07:15:38 np0005546909 nova_compute[187208]: 2025-12-05 12:15:38.776 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.478 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.479 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.479 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.480 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.480 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.484 187212 INFO nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Terminating instance#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.485 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.485 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquired lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.486 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:15:39 np0005546909 nova_compute[187208]: 2025-12-05 12:15:39.975 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:15:40 np0005546909 nova_compute[187208]: 2025-12-05 12:15:40.556 187212 INFO nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Rebuilding instance#033[00m
Dec  5 07:15:40 np0005546909 nova_compute[187208]: 2025-12-05 12:15:40.890 187212 DEBUG nova.network.neutron [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:40 np0005546909 nova_compute[187208]: 2025-12-05 12:15:40.918 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Releasing lock "refresh_cache-f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:15:40 np0005546909 nova_compute[187208]: 2025-12-05 12:15:40.918 187212 DEBUG nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:15:40 np0005546909 systemd[1]: machine-qemu\x2d121\x2dinstance\x2d00000062.scope: Deactivated successfully.
Dec  5 07:15:40 np0005546909 systemd[1]: machine-qemu\x2d121\x2dinstance\x2d00000062.scope: Consumed 4.728s CPU time.
Dec  5 07:15:40 np0005546909 systemd-machined[153543]: Machine qemu-121-instance-00000062 terminated.
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.191 187212 INFO nova.virt.libvirt.driver [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance destroyed successfully.#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.192 187212 DEBUG nova.objects.instance [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lazy-loading 'resources' on Instance uuid f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.207 187212 INFO nova.virt.libvirt.driver [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deleting instance files /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.208 187212 INFO nova.virt.libvirt.driver [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deletion of /var/lib/nova/instances/f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5_del complete#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.260 187212 INFO nova.compute.manager [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG oslo.service.loopingcall [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.261 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.538 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.557 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.610 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_requests' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.630 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.645 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.659 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.666 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.675 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.677 187212 DEBUG nova.network.neutron [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.680 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.689 187212 INFO nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Took 0.43 seconds to deallocate network for instance.#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.734 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.734 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.848 187212 DEBUG nova.compute.provider_tree [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.873 187212 DEBUG nova.scheduler.client.report [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.910 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:41 np0005546909 nova_compute[187208]: 2025-12-05 12:15:41.940 187212 INFO nova.scheduler.client.report [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Deleted allocations for instance f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5#033[00m
Dec  5 07:15:42 np0005546909 nova_compute[187208]: 2025-12-05 12:15:42.023 187212 DEBUG oslo_concurrency.lockutils [None req-19cbf0ec-597f-4302-9540-ec7f67ec1a3e 3733ad965c154ae490947ad2a50e221d 239937ac98c24d5198788674713b75a1 - - default default] Lock "f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:43 np0005546909 nova_compute[187208]: 2025-12-05 12:15:43.010 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:43 np0005546909 nova_compute[187208]: 2025-12-05 12:15:43.778 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:46 np0005546909 nova_compute[187208]: 2025-12-05 12:15:46.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:46 np0005546909 nova_compute[187208]: 2025-12-05 12:15:46.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:46 np0005546909 nova_compute[187208]: 2025-12-05 12:15:46.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:15:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:47Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:1e:ff 10.100.0.8
Dec  5 07:15:47 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:47Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:1e:ff 10.100.0.8
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.779 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.780 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.780 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:15:47 np0005546909 nova_compute[187208]: 2025-12-05 12:15:47.781 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:48 np0005546909 nova_compute[187208]: 2025-12-05 12:15:48.015 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:48 np0005546909 podman[239792]: 2025-12-05 12:15:48.21856241 +0000 UTC m=+0.058195944 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:15:48 np0005546909 podman[239791]: 2025-12-05 12:15:48.226097718 +0000 UTC m=+0.070583572 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Dec  5 07:15:48 np0005546909 podman[239793]: 2025-12-05 12:15:48.261936384 +0000 UTC m=+0.091493796 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:15:48 np0005546909 nova_compute[187208]: 2025-12-05 12:15:48.780 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:51 np0005546909 nova_compute[187208]: 2025-12-05 12:15:51.731 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.017 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.672 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.695 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.696 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.697 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.697 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:53 np0005546909 kernel: tapb5a9a5df-a9 (unregistering): left promiscuous mode
Dec  5 07:15:53 np0005546909 NetworkManager[55691]: <info>  [1764936953.9528] device (tapb5a9a5df-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:15:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:53Z|01054|binding|INFO|Releasing lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e from this chassis (sb_readonly=0)
Dec  5 07:15:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:53Z|01055|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e down in Southbound
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:53 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:53Z|01056|binding|INFO|Removing iface tapb5a9a5df-a9 ovn-installed in OVS
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.964 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:53 np0005546909 nova_compute[187208]: 2025-12-05 12:15:53.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.979 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:15:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.980 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:15:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:53.983 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2b8b34-9839-4b13-8a05-f325ca71bfcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 systemd[1]: machine-qemu\x2d120\x2dinstance\x2d00000063.scope: Deactivated successfully.
Dec  5 07:15:54 np0005546909 systemd[1]: machine-qemu\x2d120\x2dinstance\x2d00000063.scope: Consumed 12.567s CPU time.
Dec  5 07:15:54 np0005546909 systemd-machined[153543]: Machine qemu-120-instance-00000063 terminated.
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.035 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[c67f7e42-6acd-4dbc-bce1-0e90a524de48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.039 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7fcb0a-7ad6-4b5c-a359-01dbf848ff3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[6d24b2a3-7940-42a7-b65a-2d28e9d91350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.087 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[925cb1c3-1299-4f12-887e-a778566ba68c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 616, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239882, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.102 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a2feea-1198-49a8-95b5-f4511eead67b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239883, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239883, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.104 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.110 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:54.111 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.126 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.126 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.127 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.127 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.203 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.317 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.378 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.379 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.447 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.453 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.482 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 DEBUG oslo_concurrency.lockutils [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 DEBUG nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.483 187212 WARNING nova.compute.manager [req-0b224c67-cf24-4156-b3b0-26a6e23986f7 req-90eb165a-64f8-4013-8773-a6663cf7f140 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state rebuilding.#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.524 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.525 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.590 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.745 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.752 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.761 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.762 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-proj
ect-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:39Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.763 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.764 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.765 187212 DEBUG os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.769 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.770 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5a9a5df-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.771 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.775 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.780 187212 INFO os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.780 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deleting instance files /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.781 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deletion of /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del complete#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.807 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.808 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5412MB free_disk=72.98209381103516GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.808 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.809 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.985 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.985 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance b9ba9fad-eaef-4c3b-9793-23053fe1ace1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.986 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:15:54 np0005546909 nova_compute[187208]: 2025-12-05 12:15:54.986 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.058 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.059 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating image(s)#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.060 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.080 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.111 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.146 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.151 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.152 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.152 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.164 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.184 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.185 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.220 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.220 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.302 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk 1073741824" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.304 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.305 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:55.353 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.353 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:55.354 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.376 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.377 187212 DEBUG nova.virt.disk.api [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.378 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.447 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.disk.api [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.449 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Ensure instance console log exists: /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.450 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.450 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.451 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.453 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start _get_guest_xml network_info=[{"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.457 187212 WARNING nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.462 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.463 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.466 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.467 187212 DEBUG nova.virt.libvirt.host [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.468 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.469 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.virt.hardware [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.470 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.499 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name=
'tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:54Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.500 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.501 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <uuid>b9ba9fad-eaef-4c3b-9793-23053fe1ace1</uuid>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <name>instance-00000063</name>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestJSON-server-975018653</nova:name>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:15:55</nova:creationTime>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        <nova:port uuid="b5a9a5df-a95c-46bb-b043-0ff6ae79599e">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="serial">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="uuid">b9ba9fad-eaef-4c3b-9793-23053fe1ace1</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:6b:1e:ff"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <target dev="tapb5a9a5df-a9"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/console.log" append="off"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:15:55 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:15:55 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:15:55 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:15:55 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Preparing to wait for external event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.503 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.504 187212 DEBUG nova.virt.libvirt.vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name=
'tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:15:54Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG nova.network.os_vif_util [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.505 187212 DEBUG os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.506 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.506 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.507 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.510 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5a9a5df-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.511 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5a9a5df-a9, col_values=(('external_ids', {'iface-id': 'b5a9a5df-a95c-46bb-b043-0ff6ae79599e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:1e:ff', 'vm-uuid': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:55 np0005546909 NetworkManager[55691]: <info>  [1764936955.5136] manager: (tapb5a9a5df-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/403)
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.519 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.521 187212 INFO os_vif [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.566 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] No VIF found with MAC fa:16:3e:6b:1e:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.567 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Using config drive#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.582 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:55 np0005546909 nova_compute[187208]: 2025-12-05 12:15:55.628 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'keypairs' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.188 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936941.1868029, f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.189 187212 INFO nova.compute.manager [-] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.211 187212 DEBUG nova.compute.manager [None req-9cab6225-addf-4bf1-aab8-63a0924bc801 - - - - - -] [instance: f7dc4fea-4c3d-40cb-b9dd-53d52f3873b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.595 187212 INFO nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Creating config drive at /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.600 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p2upt_q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.646 187212 DEBUG nova.compute.manager [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.646 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG oslo_concurrency.lockutils [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.647 187212 DEBUG nova.compute.manager [req-582fbcf8-c865-4a1e-825a-59b051dcd8f0 req-36229c29-189d-4c45-8888-59d5230da34c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Processing event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.724 187212 DEBUG oslo_concurrency.processutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p2upt_q" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:15:56 np0005546909 kernel: tapb5a9a5df-a9: entered promiscuous mode
Dec  5 07:15:56 np0005546909 systemd-udevd[239874]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:15:56 np0005546909 NetworkManager[55691]: <info>  [1764936956.7831] manager: (tapb5a9a5df-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/404)
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.783 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:56Z|01057|binding|INFO|Claiming lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e for this chassis.
Dec  5 07:15:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:56Z|01058|binding|INFO|b5a9a5df-a95c-46bb-b043-0ff6ae79599e: Claiming fa:16:3e:6b:1e:ff 10.100.0.8
Dec  5 07:15:56 np0005546909 NetworkManager[55691]: <info>  [1764936956.7946] device (tapb5a9a5df-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:15:56 np0005546909 NetworkManager[55691]: <info>  [1764936956.7969] device (tapb5a9a5df-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:15:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:56Z|01059|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e ovn-installed in OVS
Dec  5 07:15:56 np0005546909 ovn_controller[95610]: 2025-12-05T12:15:56Z|01060|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e up in Southbound
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.797 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.798 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.800 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.839 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[64e3ec6b-8062-43bf-a3ba-3eab46166dda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.869 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[d95f8e90-f9d4-4055-9087-c41393b788f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.873 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[dea285dd-bd9a-4a7d-8a2e-7d4300c562b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 systemd-machined[153543]: New machine qemu-122-instance-00000063.
Dec  5 07:15:56 np0005546909 systemd[1]: Started Virtual Machine qemu-122-instance-00000063.
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[48ef2128-8382-44d0-b85d-5c94da684778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.923 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe6a555-d681-4af4-bda8-13b643060c68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 10, 'rx_bytes': 616, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239953, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.942 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd47ae2-b1c5-46bc-8114-aae74769b441]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239956, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239956, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.944 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:56 np0005546909 nova_compute[187208]: 2025-12-05 12:15:56.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.973 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:56.974 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.138 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for b9ba9fad-eaef-4c3b-9793-23053fe1ace1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.139 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1384003, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.139 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Started (Lifecycle Event)#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.142 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.145 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.149 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance spawned successfully.#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.149 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.162 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.167 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.172 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.173 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.173 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.174 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.175 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.175 187212 DEBUG nova.virt.libvirt.driver [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.181 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.202 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.203 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1385927, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.203 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.239 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.242 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936957.1446872, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.242 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.249 187212 DEBUG nova.compute.manager [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.276 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.280 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.324 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.324 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.325 187212 DEBUG nova.objects.instance [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:15:57 np0005546909 nova_compute[187208]: 2025-12-05 12:15:57.419 187212 DEBUG oslo_concurrency.lockutils [None req-fab90aac-d102-4fee-96f0-95427d622f1d 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:15:58.356 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.975 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.976 187212 WARNING nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.977 187212 DEBUG oslo_concurrency.lockutils [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.978 187212 DEBUG nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:15:58 np0005546909 nova_compute[187208]: 2025-12-05 12:15:58.978 187212 WARNING nova.compute.manager [req-032beee4-82c9-4d36-82cb-1d956d96272e req-6d1e24b0-9fa3-4ab0-a238-1da9b91adb58 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state active and task_state None.#033[00m
Dec  5 07:15:59 np0005546909 podman[239969]: 2025-12-05 12:15:59.196837771 +0000 UTC m=+0.052575621 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:16:00 np0005546909 nova_compute[187208]: 2025-12-05 12:16:00.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.020 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.021 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:03 np0005546909 nova_compute[187208]: 2025-12-05 12:16:03.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:04 np0005546909 podman[239993]: 2025-12-05 12:16:04.223321666 +0000 UTC m=+0.068921094 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.121 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.123 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.124 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.125 187212 INFO nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Terminating instance#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.126 187212 DEBUG nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:16:05 np0005546909 kernel: tapb5a9a5df-a9 (unregistering): left promiscuous mode
Dec  5 07:16:05 np0005546909 NetworkManager[55691]: <info>  [1764936965.1481] device (tapb5a9a5df-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:05Z|01061|binding|INFO|Releasing lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e from this chassis (sb_readonly=0)
Dec  5 07:16:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:05Z|01062|binding|INFO|Setting lport b5a9a5df-a95c-46bb-b043-0ff6ae79599e down in Southbound
Dec  5 07:16:05 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:05Z|01063|binding|INFO|Removing iface tapb5a9a5df-a9 ovn-installed in OVS
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.169 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:1e:ff 10.100.0.8'], port_security=['fa:16:3e:6b:1e:ff 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b9ba9fad-eaef-4c3b-9793-23053fe1ace1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '10cc7c6d-475c-43b4-8ab3-df3294aff9b7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=b5a9a5df-a95c-46bb-b043-0ff6ae79599e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.170 104471 INFO neutron.agent.ovn.metadata.agent [-] Port b5a9a5df-a95c-46bb-b043-0ff6ae79599e in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.171 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.190 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35ae5ce2-f618-4729-a1e1-912de0dc1dfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 systemd[1]: machine-qemu\x2d122\x2dinstance\x2d00000063.scope: Deactivated successfully.
Dec  5 07:16:05 np0005546909 systemd[1]: machine-qemu\x2d122\x2dinstance\x2d00000063.scope: Consumed 8.309s CPU time.
Dec  5 07:16:05 np0005546909 systemd-machined[153543]: Machine qemu-122-instance-00000063 terminated.
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.230 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[81581326-0e4c-4fd4-9ee1-14cda6845459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.235 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9f247d49-73b0-4b0c-8a54-6fa0b5edc585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.270 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[a620b149-5570-4dec-9b30-442b56205d00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[52efdc18-9ab2-4a57-b6b0-e184b9bcea2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 12, 'rx_bytes': 616, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 286], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426014, 'reachable_time': 23848, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240025, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.309 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcd973f-d69d-4845-81ff-344def117080]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426028, 'tstamp': 426028}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240026, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9ed41c2-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426031, 'tstamp': 426031}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240026, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.311 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.318 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.319 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.319 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:05.320 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.405 187212 INFO nova.virt.libvirt.driver [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Instance destroyed successfully.#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.406 187212 DEBUG nova.objects.instance [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid b9ba9fad-eaef-4c3b-9793-23053fe1ace1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.421 187212 DEBUG nova.virt.libvirt.vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-05T12:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-58863967',display_name='tempest-ServerActionsTestJSON-server-975018653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-58863967',id=99,image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:15:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-irsjw7eb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6e277715-617f-4e35-89c7-208beae9fd5c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:15:57Z,user_data=None,user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=b9ba9fad-eaef-4c3b-9793-23053fe1ace1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.421 187212 DEBUG nova.network.os_vif_util [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "address": "fa:16:3e:6b:1e:ff", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5a9a5df-a9", "ovs_interfaceid": "b5a9a5df-a95c-46bb-b043-0ff6ae79599e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.422 187212 DEBUG nova.network.os_vif_util [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.422 187212 DEBUG os_vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.426 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.427 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5a9a5df-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.428 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.431 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.434 187212 INFO os_vif [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:1e:ff,bridge_name='br-int',has_traffic_filtering=True,id=b5a9a5df-a95c-46bb-b043-0ff6ae79599e,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5a9a5df-a9')#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.434 187212 INFO nova.virt.libvirt.driver [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deleting instance files /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.435 187212 INFO nova.virt.libvirt.driver [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deletion of /var/lib/nova/instances/b9ba9fad-eaef-4c3b-9793-23053fe1ace1_del complete#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.484 187212 INFO nova.compute.manager [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG oslo.service.loopingcall [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:16:05 np0005546909 nova_compute[187208]: 2025-12-05 12:16:05.485 187212 DEBUG nova.network.neutron [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.383 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG oslo_concurrency.lockutils [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:06 np0005546909 nova_compute[187208]: 2025-12-05 12:16:06.384 187212 DEBUG nova.compute.manager [req-8ef24a47-525a-430d-90aa-0e0047d593ae req-b87002a6-0c77-4dd8-ad75-fa2c3d4d3cf0 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-unplugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:16:07 np0005546909 nova_compute[187208]: 2025-12-05 12:16:07.794 187212 DEBUG nova.network.neutron [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:16:07 np0005546909 nova_compute[187208]: 2025-12-05 12:16:07.938 187212 INFO nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Took 2.45 seconds to deallocate network for instance.#033[00m
Dec  5 07:16:07 np0005546909 nova_compute[187208]: 2025-12-05 12:16:07.990 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:07 np0005546909 nova_compute[187208]: 2025-12-05 12:16:07.991 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.093 187212 DEBUG nova.compute.provider_tree [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.127 187212 DEBUG nova.scheduler.client.report [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.157 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.187 187212 INFO nova.scheduler.client.report [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Deleted allocations for instance b9ba9fad-eaef-4c3b-9793-23053fe1ace1#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.281 187212 DEBUG oslo_concurrency.lockutils [None req-7457acc1-1c67-454b-b6fa-878c2d777c59 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.301 187212 DEBUG nova.compute.manager [req-7be324cb-872b-4237-9d15-923d0a0bd071 req-1d4a94ab-1f03-4afd-8bf4-5a075add4988 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-deleted-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.767 187212 DEBUG nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG oslo_concurrency.lockutils [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b9ba9fad-eaef-4c3b-9793-23053fe1ace1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.768 187212 DEBUG nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] No waiting events found dispatching network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.769 187212 WARNING nova.compute.manager [req-519aacab-28de-42cf-a999-1cb323cbd1ec req-21eb7fbb-b590-4196-ae77-b19f82200a8b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Received unexpected event network-vif-plugged-b5a9a5df-a95c-46bb-b043-0ff6ae79599e for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:16:08 np0005546909 nova_compute[187208]: 2025-12-05 12:16:08.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:09 np0005546909 podman[240046]: 2025-12-05 12:16:09.24631107 +0000 UTC m=+0.083465395 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:16:09 np0005546909 podman[240045]: 2025-12-05 12:16:09.258395489 +0000 UTC m=+0.101864056 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:16:10 np0005546909 nova_compute[187208]: 2025-12-05 12:16:10.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:13 np0005546909 nova_compute[187208]: 2025-12-05 12:16:13.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:14 np0005546909 nova_compute[187208]: 2025-12-05 12:16:14.981 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:14 np0005546909 nova_compute[187208]: 2025-12-05 12:16:14.982 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:14 np0005546909 nova_compute[187208]: 2025-12-05 12:16:14.983 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:14 np0005546909 nova_compute[187208]: 2025-12-05 12:16:14.986 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Dec  5 07:16:14 np0005546909 nova_compute[187208]: 2025-12-05 12:16:14.987 187212 DEBUG nova.objects.instance [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:15 np0005546909 nova_compute[187208]: 2025-12-05 12:16:15.010 187212 DEBUG nova.virt.libvirt.driver [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:16:15 np0005546909 nova_compute[187208]: 2025-12-05 12:16:15.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec  5 07:16:17 np0005546909 NetworkManager[55691]: <info>  [1764936977.1558] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:17Z|01064|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec  5 07:16:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:17Z|01065|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec  5 07:16:17 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:17Z|01066|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.171 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.173 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.174 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.175 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ca16284a-a7e1-4abf-95de-de4c12876b25]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.176 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 systemd[1]: machine-qemu\x2d117\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec  5 07:16:17 np0005546909 systemd[1]: machine-qemu\x2d117\x2dinstance\x2d0000005c.scope: Consumed 17.600s CPU time.
Dec  5 07:16:17 np0005546909 systemd-machined[153543]: Machine qemu-117-instance-0000005c terminated.
Dec  5 07:16:17 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : haproxy version is 2.8.14-c23fe91
Dec  5 07:16:17 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [NOTICE]   (239103) : path to executable is /usr/sbin/haproxy
Dec  5 07:16:17 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [WARNING]  (239103) : Exiting Master process...
Dec  5 07:16:17 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [ALERT]    (239103) : Current worker (239105) exited with code 143 (Terminated)
Dec  5 07:16:17 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[239098]: [WARNING]  (239103) : All workers exited. Exiting... (0)
Dec  5 07:16:17 np0005546909 systemd[1]: libpod-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14.scope: Deactivated successfully.
Dec  5 07:16:17 np0005546909 podman[240108]: 2025-12-05 12:16:17.314491264 +0000 UTC m=+0.048163174 container died f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:16:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay-2881d06722a68d1b04f02f9f59234c376caaf5f0e8e911e5a9b452b4d2ddaaf9-merged.mount: Deactivated successfully.
Dec  5 07:16:17 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14-userdata-shm.mount: Deactivated successfully.
Dec  5 07:16:17 np0005546909 podman[240108]: 2025-12-05 12:16:17.361677448 +0000 UTC m=+0.095349368 container cleanup f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  5 07:16:17 np0005546909 systemd[1]: libpod-conmon-f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14.scope: Deactivated successfully.
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.388 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.393 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 podman[240139]: 2025-12-05 12:16:17.441469695 +0000 UTC m=+0.056126474 container remove f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.448 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2bdc7f-2d25-45f4-a0ba-da2b93424e2e]: (4, ('Fri Dec  5 12:16:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14)\nf36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14\nFri Dec  5 12:16:17 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (f36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14)\nf36eb0fc14b414186c6d21340214aa8d5ad19ff723bb7cdf6999288989b93a14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.451 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8984c32-b7d6-46ef-bb3d-2a9abfdef3ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.452 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.469 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.472 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb775db-bfcf-47c4-ba8d-045566accd02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.492 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ecd603-b4da-46b5-8f49-cff577a4902d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.493 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[533a20e5-5281-4523-b481-30f832faf42a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4d5fc76-1341-4c23-95d3-eab39c39bb14]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426006, 'reachable_time': 16236, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240174, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.513 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:16:17 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:17.513 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[35de48f5-3ce2-4168-afb9-79ebf813f235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:17 np0005546909 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.637 187212 DEBUG nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.638 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.638 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 DEBUG oslo_concurrency.lockutils [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 DEBUG nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:17 np0005546909 nova_compute[187208]: 2025-12-05 12:16:17.639 187212 WARNING nova.compute.manager [req-17f4b587-1e56-4628-8e1f-1e609b0417bb req-8e15e76c-9891-4b06-8fd3-52a078c9f562 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state powering-off.#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.028 187212 INFO nova.virt.libvirt.driver [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance shutdown successfully after 3 seconds.#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.035 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.036 187212 DEBUG nova.objects.instance [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.051 187212 DEBUG nova.compute.manager [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.100 187212 DEBUG oslo_concurrency.lockutils [None req-adc2d0e2-de79-43a9-a0a9-1f2661fed9b9 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:18 np0005546909 nova_compute[187208]: 2025-12-05 12:16:18.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:19 np0005546909 podman[240176]: 2025-12-05 12:16:19.21150662 +0000 UTC m=+0.059322856 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:16:19 np0005546909 podman[240175]: 2025-12-05 12:16:19.217425531 +0000 UTC m=+0.065563977 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:16:19 np0005546909 podman[240177]: 2025-12-05 12:16:19.236849043 +0000 UTC m=+0.081902909 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.137 187212 DEBUG nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.138 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.138 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 DEBUG oslo_concurrency.lockutils [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 DEBUG nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.139 187212 WARNING nova.compute.manager [req-925ef461-1b50-4755-849a-9d6905522c7e req-0d79d737-551c-46c1-ae96-d820105d76c1 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state stopped and task_state None.#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.263 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.288 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.288 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.289 187212 DEBUG nova.network.neutron [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.289 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'info_cache' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.403 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764936965.402231, b9ba9fad-eaef-4c3b-9793-23053fe1ace1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.404 187212 INFO nova.compute.manager [-] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.435 187212 DEBUG nova.compute.manager [None req-aff28276-aa34-4b41-bc55-5e9304435a67 - - - - - -] [instance: b9ba9fad-eaef-4c3b-9793-23053fe1ace1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:20 np0005546909 nova_compute[187208]: 2025-12-05 12:16:20.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:23 np0005546909 nova_compute[187208]: 2025-12-05 12:16:23.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.188 187212 DEBUG nova.network.neutron [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.218 187212 DEBUG oslo_concurrency.lockutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.249 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.250 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.265 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.279 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.279 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.280 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.281 187212 DEBUG os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.282 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.282 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.312 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.317 187212 INFO os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.324 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start _get_guest_xml network_info=[{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.327 187212 WARNING nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.333 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.334 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.339 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.libvirt.host [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.340 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.341 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.342 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.343 187212 DEBUG nova.virt.hardware [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.344 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.363 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.364 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.364 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.366 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.385 187212 DEBUG nova.virt.libvirt.driver [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <uuid>2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</uuid>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <name>instance-0000005c</name>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerActionsTestJSON-server-954339420</nova:name>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:16:25</nova:creationTime>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:user uuid="41799f35c2764b25912247e2e8e2e9c5">tempest-ServerActionsTestJSON-1748869140-project-member</nova:user>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:project uuid="75752a4cc8f7487e8dc4440201f894c8">tempest-ServerActionsTestJSON-1748869140</nova:project>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        <nova:port uuid="5316adeb-5a49-4a58-b997-f132a083ff13">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="serial">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="uuid">2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk.config"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:9a:d0:34"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <target dev="tap5316adeb-5a"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/console.log" append="off"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <input type="keyboard" bus="usb"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:16:25 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:16:25 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:16:25 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:16:25 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.387 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.472 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.473 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.529 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.531 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.546 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.603 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.604 187212 DEBUG nova.virt.disk.api [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Checking if we can resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.605 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.664 187212 DEBUG oslo_concurrency.processutils [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.665 187212 DEBUG nova.virt.disk.api [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Cannot resize image /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.665 187212 DEBUG nova.objects.instance [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'migration_context' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.681 187212 DEBUG nova.virt.libvirt.vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:16:18Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.681 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.682 187212 DEBUG nova.network.os_vif_util [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.683 187212 DEBUG os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.684 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.684 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.687 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.6903] manager: (tap5316adeb-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.694 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.695 187212 INFO os_vif [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')#033[00m
Dec  5 07:16:25 np0005546909 kernel: tap5316adeb-5a: entered promiscuous mode
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.7822] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/406)
Dec  5 07:16:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:25Z|01067|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.784 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:25Z|01068|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.793 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '11', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.794 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.795 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:16:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:25Z|01069|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec  5 07:16:25 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:25Z|01070|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.798 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.803 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 nova_compute[187208]: 2025-12-05 12:16:25.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.809 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3c46efc6-f054-4623-9eb2-390b7baebe07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.810 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:16:25 np0005546909 systemd-udevd[240270]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.813 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.813 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8a810ec2-1539-4ae1-bc12-42f3a9982761]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.814 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f05f3e-b245-4494-beae-901e51f0036d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.8254] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.825 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[938b0bc1-bc0b-4aad-aefb-cb1611916b5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.8263] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:16:25 np0005546909 systemd-machined[153543]: New machine qemu-123-instance-0000005c.
Dec  5 07:16:25 np0005546909 systemd[1]: Started Virtual Machine qemu-123-instance-0000005c.
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.841 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[10cc03c4-35b7-4ddf-a24b-d2be6b27ca31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.869 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0f408ead-59e0-46a6-b93a-4601a0f3c570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 systemd-udevd[240275]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.8758] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/407)
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.875 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eec35729-328c-4a27-88c4-1063e002ed99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[3c581962-c232-4517-8ecc-f5dd67058804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.907 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e7048f33-78c0-4d14-90a1-5d0450a7d106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 NetworkManager[55691]: <info>  [1764936985.9330] device (tapf9ed41c2-b0): carrier: link connected
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.937 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[8669df66-c7a7-4fb7-b57b-b2f2f6370645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.958 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2f2f69-b634-4669-bd39-bb2ef0b09c7a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436649, 'reachable_time': 33956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240304, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.973 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[641a07ae-4161-4222-9901-86f3c24b4f00]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436649, 'tstamp': 436649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240305, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:25.990 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af98b762-31bc-4197-9c11-ee8f6bfc439d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436649, 'reachable_time': 33956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240306, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.023 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0e1321a0-4978-416c-a11b-4e725fbab641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.082 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af89a8e6-c2fb-461e-a3fe-33508738eaac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.084 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:26 np0005546909 NetworkManager[55691]: <info>  [1764936986.0871] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.087 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:26 np0005546909 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.090 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:26 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:26Z|01071|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.106 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.107 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0c174a89-49dc-4963-a383-aca469768b5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.108 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:16:26 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:26.109 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.189 187212 DEBUG nova.compute.manager [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.192 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.192 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936986.189252, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.193 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.197 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance rebooted successfully.#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.198 187212 DEBUG nova.compute.manager [None req-08a1474e-38a1-4e6b-aaca-73dbe6b8b303 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.211 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.214 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.234 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.235 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936986.1904628, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.235 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.265 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.268 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.425 187212 DEBUG nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.426 187212 DEBUG oslo_concurrency.lockutils [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.427 187212 DEBUG nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:26 np0005546909 nova_compute[187208]: 2025-12-05 12:16:26.427 187212 WARNING nova.compute.manager [req-82f714dc-91d2-4af8-a174-8c4eb96d0333 req-64689725-9f60-470c-9e2c-9ab6ff118dee 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:16:26 np0005546909 podman[240342]: 2025-12-05 12:16:26.540585056 +0000 UTC m=+0.050826001 container create b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 07:16:26 np0005546909 systemd[1]: Started libpod-conmon-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope.
Dec  5 07:16:26 np0005546909 podman[240342]: 2025-12-05 12:16:26.513552994 +0000 UTC m=+0.023793969 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:16:26 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:16:26 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c79a167d190ccd5eec04f331a8b94c8e257ebf6ac48b138dc6af8474b8be0235/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:16:26 np0005546909 podman[240342]: 2025-12-05 12:16:26.623969807 +0000 UTC m=+0.134210752 container init b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:16:26 np0005546909 podman[240342]: 2025-12-05 12:16:26.628544919 +0000 UTC m=+0.138785864 container start b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  5 07:16:26 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : New worker (240363) forked
Dec  5 07:16:26 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : Loading success.
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.737 187212 DEBUG nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.738 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG oslo_concurrency.lockutils [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.739 187212 DEBUG nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.740 187212 WARNING nova.compute.manager [req-687fd1a0-37b7-49af-923b-208bc57fcc56 req-4d26d06e-471f-496b-88c1-e237af2b1342 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:16:28 np0005546909 nova_compute[187208]: 2025-12-05 12:16:28.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:30 np0005546909 podman[240373]: 2025-12-05 12:16:30.226213164 +0000 UTC m=+0.082611650 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:16:30 np0005546909 nova_compute[187208]: 2025-12-05 12:16:30.689 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:33 np0005546909 nova_compute[187208]: 2025-12-05 12:16:33.835 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.406 187212 DEBUG nova.objects.instance [None req-559428be-cb06-41d2-8164-1f861638a55e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.437 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764936994.4376903, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.438 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.460 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.464 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:16:34 np0005546909 nova_compute[187208]: 2025-12-05 12:16:34.485 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Dec  5 07:16:35 np0005546909 podman[240401]: 2025-12-05 12:16:35.197832091 +0000 UTC m=+0.056222096 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:16:35 np0005546909 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec  5 07:16:35 np0005546909 NetworkManager[55691]: <info>  [1764936995.2431] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:16:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:35Z|01072|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec  5 07:16:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:35Z|01073|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:35Z|01074|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.252 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 systemd[1]: machine-qemu\x2d123\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec  5 07:16:35 np0005546909 systemd[1]: machine-qemu\x2d123\x2dinstance\x2d0000005c.scope: Consumed 9.100s CPU time.
Dec  5 07:16:35 np0005546909 systemd-machined[153543]: Machine qemu-123-instance-0000005c terminated.
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.420 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '12', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.422 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.424 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.425 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eee0ce79-6154-4337-8215-35737826d9f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore#033[00m
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.455 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.493 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'name': 'tempest-ServerActionsTestJSON-server-954339420', 'flavor': {'id': 'dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a6987852-063f-405d-a848-6b382694811e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '75752a4cc8f7487e8dc4440201f894c8', 'user_id': '41799f35c2764b25912247e2e8e2e9c5', 'hostId': '60bdcc72c489e9e9b7670f227f931ad6575d4971458efc9f6db733d2', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.494 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.494 187212 DEBUG nova.compute.manager [None req-559428be-cb06-41d2-8164-1f861638a55e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.496 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.496 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.497 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.498 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.499 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.499 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.501 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.501 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.502 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.503 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.504 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.505 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.505 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.506 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.506 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.507 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.508 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.509 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.509 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.510 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.511 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.512 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.512 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec  5 07:16:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:16:35.513 12 DEBUG ceilometer.compute.pollsters [-] Instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000005c, id=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec  5 07:16:35 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : haproxy version is 2.8.14-c23fe91
Dec  5 07:16:35 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [NOTICE]   (240361) : path to executable is /usr/sbin/haproxy
Dec  5 07:16:35 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [WARNING]  (240361) : Exiting Master process...
Dec  5 07:16:35 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [ALERT]    (240361) : Current worker (240363) exited with code 143 (Terminated)
Dec  5 07:16:35 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240357]: [WARNING]  (240361) : All workers exited. Exiting... (0)
Dec  5 07:16:35 np0005546909 systemd[1]: libpod-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope: Deactivated successfully.
Dec  5 07:16:35 np0005546909 conmon[240357]: conmon b81853b290644b432f54 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope/container/memory.events
Dec  5 07:16:35 np0005546909 podman[240463]: 2025-12-05 12:16:35.577700734 +0000 UTC m=+0.049222024 container died b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  5 07:16:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c-userdata-shm.mount: Deactivated successfully.
Dec  5 07:16:35 np0005546909 systemd[1]: var-lib-containers-storage-overlay-c79a167d190ccd5eec04f331a8b94c8e257ebf6ac48b138dc6af8474b8be0235-merged.mount: Deactivated successfully.
Dec  5 07:16:35 np0005546909 podman[240463]: 2025-12-05 12:16:35.61457051 +0000 UTC m=+0.086091810 container cleanup b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:16:35 np0005546909 systemd[1]: libpod-conmon-b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c.scope: Deactivated successfully.
Dec  5 07:16:35 np0005546909 podman[240494]: 2025-12-05 12:16:35.68165763 +0000 UTC m=+0.046723812 container remove b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.686 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[918aa1b4-9e33-4820-9396-9b7b11b400cd]: (4, ('Fri Dec  5 12:16:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c)\nb81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c\nFri Dec  5 12:16:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (b81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c)\nb81853b290644b432f54f10fa1d84f60b4b76b7ac7de8df11fd204f2b787a63c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3448f8ec-1d9c-49ff-b8d6-189bc08b5b22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.690 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec  5 07:16:35 np0005546909 nova_compute[187208]: 2025-12-05 12:16:35.707 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.710 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea667f3-a1a8-49ed-b48d-e12510ac696c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.735 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2d1a9d-6416-4ea7-856d-40305d4bce57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.736 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9c03387f-6b43-4c26-8181-18102a6ae2e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.751 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[59cc5c0a-6f69-495b-aec3-8daac0df9529]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436642, 'reachable_time': 44076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240510, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:35 np0005546909 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.756 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:16:35 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:35.757 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[de5deb9e-063b-4820-bebf-83ec00ee0a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.210 187212 DEBUG nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.210 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG oslo_concurrency.lockutils [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.211 187212 DEBUG nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:36 np0005546909 nova_compute[187208]: 2025-12-05 12:16:36.212 187212 WARNING nova.compute.manager [req-4338c4fa-7d78-4556-b398-73a25c9a489d req-2a17d8fa-2b11-4e83-a7ee-414b57c44f2e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-unplugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state suspended and task_state None.#033[00m
Dec  5 07:16:37 np0005546909 nova_compute[187208]: 2025-12-05 12:16:37.752 187212 INFO nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Resuming#033[00m
Dec  5 07:16:37 np0005546909 nova_compute[187208]: 2025-12-05 12:16:37.753 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'flavor' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:37 np0005546909 nova_compute[187208]: 2025-12-05 12:16:37.795 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:16:37 np0005546909 nova_compute[187208]: 2025-12-05 12:16:37.795 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquired lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:16:37 np0005546909 nova_compute[187208]: 2025-12-05 12:16:37.796 187212 DEBUG nova.network.neutron [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.383 187212 DEBUG nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.383 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG oslo_concurrency.lockutils [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 DEBUG nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.384 187212 WARNING nova.compute.manager [req-a4b02f30-8ded-4331-b065-6002680c0e9d req-7f176850-83f9-43ce-b42a-87579fb20b72 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state suspended and task_state resuming.#033[00m
Dec  5 07:16:38 np0005546909 nova_compute[187208]: 2025-12-05 12:16:38.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:40 np0005546909 podman[240512]: 2025-12-05 12:16:40.199937082 +0000 UTC m=+0.052607352 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent)
Dec  5 07:16:40 np0005546909 podman[240511]: 2025-12-05 12:16:40.21163203 +0000 UTC m=+0.062125887 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.6, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:16:40 np0005546909 nova_compute[187208]: 2025-12-05 12:16:40.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.593 187212 DEBUG nova.network.neutron [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [{"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.609 187212 DEBUG oslo_concurrency.lockutils [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Releasing lock "refresh_cache-2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.615 187212 DEBUG nova.virt.libvirt.vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.615 187212 DEBUG nova.network.os_vif_util [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.616 187212 DEBUG nova.network.os_vif_util [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.616 187212 DEBUG os_vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.617 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.620 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5316adeb-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5316adeb-5a, col_values=(('external_ids', {'iface-id': '5316adeb-5a49-4a58-b997-f132a083ff13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:d0:34', 'vm-uuid': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.621 187212 INFO os_vif [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.644 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:41 np0005546909 kernel: tap5316adeb-5a: entered promiscuous mode
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.733 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:41Z|01075|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec  5 07:16:41 np0005546909 NetworkManager[55691]: <info>  [1764937001.7346] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/409)
Dec  5 07:16:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:41Z|01076|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.748 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:41Z|01077|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec  5 07:16:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:41Z|01078|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.749 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.750 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 bound to our chassis#033[00m
Dec  5 07:16:41 np0005546909 nova_compute[187208]: 2025-12-05 12:16:41.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.752 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9ed41c2-b085-41ff-ac71-6256a4e30e85#033[00m
Dec  5 07:16:41 np0005546909 systemd-udevd[240565]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.763 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a0af6a18-6ee4-473b-a073-bc864ab3afda]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.764 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9ed41c2-b1 in ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.766 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9ed41c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.766 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[54aa0737-f29d-4a5e-b405-7e6f26849f9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.767 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[925c261f-ed48-4559-a2a4-a1b0d250e209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 NetworkManager[55691]: <info>  [1764937001.7757] device (tap5316adeb-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:16:41 np0005546909 NetworkManager[55691]: <info>  [1764937001.7766] device (tap5316adeb-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.777 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[55f25acd-96b4-406d-9b17-374a46b1d4a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 systemd-machined[153543]: New machine qemu-124-instance-0000005c.
Dec  5 07:16:41 np0005546909 systemd[1]: Started Virtual Machine qemu-124-instance-0000005c.
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.802 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d502a02b-83a8-4d0c-90d4-041537baf74c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.832 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7185cd-6437-42b5-9dc3-94ba8e92512c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.838 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[78563484-eb77-4e6a-a5b1-c25ff9aa33e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 NetworkManager[55691]: <info>  [1764937001.8392] manager: (tapf9ed41c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/410)
Dec  5 07:16:41 np0005546909 systemd-udevd[240571]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.872 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[56cef919-5842-413b-a820-d8699b7d6c86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.876 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[48daf419-5be1-4a9d-a9ff-d9ce44c6e35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 NetworkManager[55691]: <info>  [1764937001.8987] device (tapf9ed41c2-b0): carrier: link connected
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.904 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[0c63d93f-3170-4fec-9e1f-2ee90a187a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.921 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[00fae274-cefd-4409-bf14-0cd9e09b2eff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438246, 'reachable_time': 32086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240600, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.937 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c21b901b-fc66-45d9-bba9-865f18c1b3fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe24:9111'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438246, 'tstamp': 438246}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240601, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.952 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[890f599b-9a32-40fa-9a95-42ebfbbf30f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9ed41c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:24:91:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438246, 'reachable_time': 32086, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240602, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:41.984 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfab87f-55da-473d-b414-2fc3bfbf078c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.046 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[23991809-4173-46ca-acaf-4577f2b4e316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.048 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9ed41c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:42 np0005546909 kernel: tapf9ed41c2-b0: entered promiscuous mode
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.050 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:42 np0005546909 NetworkManager[55691]: <info>  [1764937002.0515] manager: (tapf9ed41c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.053 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9ed41c2-b0, col_values=(('external_ids', {'iface-id': '99a3ea8e-d189-4985-b8f8-a6a58b1de324'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.054 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:42Z|01079|binding|INFO|Releasing lport 99a3ea8e-d189-4985-b8f8-a6a58b1de324 from this chassis (sb_readonly=0)
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.056 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.056 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef4af92-579b-4858-a604-7959e7bb02de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.057 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/f9ed41c2-b085-41ff-ac71-6256a4e30e85.pid.haproxy
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID f9ed41c2-b085-41ff-ac71-6256a4e30e85
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:16:42 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:42.058 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'env', 'PROCESS_TAG=haproxy-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9ed41c2-b085-41ff-ac71-6256a4e30e85.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.267 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.268 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937002.266829, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.268 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Started (Lifecycle Event)#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.285 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.295 187212 DEBUG nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.295 187212 DEBUG nova.objects.instance [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.299 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.316 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance running successfully.#033[00m
Dec  5 07:16:42 np0005546909 virtqemud[186841]: argument unsupported: QEMU guest agent is not configured
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.320 187212 DEBUG nova.virt.libvirt.guest [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.320 187212 DEBUG nova.compute.manager [None req-aeaca32b-2778-4e89-893a-6f0f8c70ca5e 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.327 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937002.2768931, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.328 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.352 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.357 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:16:42 np0005546909 nova_compute[187208]: 2025-12-05 12:16:42.380 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Dec  5 07:16:42 np0005546909 podman[240641]: 2025-12-05 12:16:42.429312046 +0000 UTC m=+0.051891491 container create 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:16:42 np0005546909 systemd[1]: Started libpod-conmon-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope.
Dec  5 07:16:42 np0005546909 podman[240641]: 2025-12-05 12:16:42.401544654 +0000 UTC m=+0.024124139 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:16:42 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:16:42 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e761d76e7d192eec1d533f773fa97374b2a9afe07eddac794d23014a3065c410/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:16:42 np0005546909 podman[240641]: 2025-12-05 12:16:42.547947847 +0000 UTC m=+0.170527322 container init 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:16:42 np0005546909 podman[240641]: 2025-12-05 12:16:42.554139806 +0000 UTC m=+0.176719251 container start 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:16:42 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : New worker (240663) forked
Dec  5 07:16:42 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : Loading success.
Dec  5 07:16:43 np0005546909 nova_compute[187208]: 2025-12-05 12:16:43.838 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.377 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.378 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.378 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.379 187212 INFO nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Terminating instance#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.380 187212 DEBUG nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:16:44 np0005546909 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec  5 07:16:44 np0005546909 NetworkManager[55691]: <info>  [1764937004.4010] device (tap5316adeb-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01080|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01081|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01082|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.421 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.422 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.424 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.424 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.426 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c28e901e-b859-4d86-a3d5-02cd795b1555]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.426 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 namespace which is not needed anymore#033[00m
Dec  5 07:16:44 np0005546909 systemd[1]: machine-qemu\x2d124\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Dec  5 07:16:44 np0005546909 systemd[1]: machine-qemu\x2d124\x2dinstance\x2d0000005c.scope: Consumed 2.573s CPU time.
Dec  5 07:16:44 np0005546909 systemd-machined[153543]: Machine qemu-124-instance-0000005c terminated.
Dec  5 07:16:44 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : haproxy version is 2.8.14-c23fe91
Dec  5 07:16:44 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [NOTICE]   (240661) : path to executable is /usr/sbin/haproxy
Dec  5 07:16:44 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [WARNING]  (240661) : Exiting Master process...
Dec  5 07:16:44 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [ALERT]    (240661) : Current worker (240663) exited with code 143 (Terminated)
Dec  5 07:16:44 np0005546909 neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85[240657]: [WARNING]  (240661) : All workers exited. Exiting... (0)
Dec  5 07:16:44 np0005546909 systemd[1]: libpod-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope: Deactivated successfully.
Dec  5 07:16:44 np0005546909 NetworkManager[55691]: <info>  [1764937004.5987] manager: (tap5316adeb-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/412)
Dec  5 07:16:44 np0005546909 kernel: tap5316adeb-5a: entered promiscuous mode
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01083|binding|INFO|Claiming lport 5316adeb-5a49-4a58-b997-f132a083ff13 for this chassis.
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01084|binding|INFO|5316adeb-5a49-4a58-b997-f132a083ff13: Claiming fa:16:3e:9a:d0:34 10.100.0.5
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.600 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 kernel: tap5316adeb-5a (unregistering): left promiscuous mode
Dec  5 07:16:44 np0005546909 podman[240694]: 2025-12-05 12:16:44.605448033 +0000 UTC m=+0.082387003 container died 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.607 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01085|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 ovn-installed in OVS
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01086|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 up in Southbound
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01087|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=1)
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01088|if_status|INFO|Dropped 2 log messages in last 292 seconds (most recently, 292 seconds ago) due to excessive rate
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01089|if_status|INFO|Not setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down as sb is readonly
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.622 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01090|binding|INFO|Removing iface tap5316adeb-5a ovn-installed in OVS
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01091|binding|INFO|Releasing lport 5316adeb-5a49-4a58-b997-f132a083ff13 from this chassis (sb_readonly=0)
Dec  5 07:16:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:16:44Z|01092|binding|INFO|Setting lport 5316adeb-5a49-4a58-b997-f132a083ff13 down in Southbound
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.637 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:d0:34 10.100.0.5'], port_security=['fa:16:3e:9a:d0:34 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75752a4cc8f7487e8dc4440201f894c8', 'neutron:revision_number': '13', 'neutron:security_group_ids': '444da1a6-3846-481d-b069-657b29adba53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.228'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f8b67612-f380-4148-a63f-745ea4d5092a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=5316adeb-5a49-4a58-b997-f132a083ff13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.658 187212 INFO nova.virt.libvirt.driver [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Instance destroyed successfully.#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.658 187212 DEBUG nova.objects.instance [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lazy-loading 'resources' on Instance uuid 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.674 187212 DEBUG nova.virt.libvirt.vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-954339420',display_name='tempest-ServerActionsTestJSON-server-954339420',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-954339420',id=92,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjH1MZiUDaj8dBB9QxOwA8yGMJMHE3ww0Db5oZK2qNp/YIE0fRK6iWBXwsZ7q2SOzB8phhq2deN0H07m/PGf5xC4NsUT/B4qrRM8zwjPKCK8h/LUXGjG3N7Qv09hpf60w==',key_name='tempest-keypair-1191480644',keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:13:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='75752a4cc8f7487e8dc4440201f894c8',ramdisk_id='',reservation_id='r-01uoglvo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1748869140',owner_user_name='tempest-ServerActionsTestJSON-1748869140-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:16:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='41799f35c2764b25912247e2e8e2e9c5',uuid=2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.674 187212 DEBUG nova.network.os_vif_util [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converting VIF {"id": "5316adeb-5a49-4a58-b997-f132a083ff13", "address": "fa:16:3e:9a:d0:34", "network": {"id": "f9ed41c2-b085-41ff-ac71-6256a4e30e85", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1915462531-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75752a4cc8f7487e8dc4440201f894c8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5316adeb-5a", "ovs_interfaceid": "5316adeb-5a49-4a58-b997-f132a083ff13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.675 187212 DEBUG nova.network.os_vif_util [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.675 187212 DEBUG os_vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.677 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5316adeb-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.679 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.684 187212 INFO os_vif [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:d0:34,bridge_name='br-int',has_traffic_filtering=True,id=5316adeb-5a49-4a58-b997-f132a083ff13,network=Network(f9ed41c2-b085-41ff-ac71-6256a4e30e85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5316adeb-5a')#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.685 187212 INFO nova.virt.libvirt.driver [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deleting instance files /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c_del#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.686 187212 INFO nova.virt.libvirt.driver [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deletion of /var/lib/nova/instances/2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c_del complete#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.739 187212 INFO nova.compute.manager [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.739 187212 DEBUG oslo.service.loopingcall [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.740 187212 DEBUG nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.740 187212 DEBUG nova.network.neutron [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:16:44 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14-userdata-shm.mount: Deactivated successfully.
Dec  5 07:16:44 np0005546909 systemd[1]: var-lib-containers-storage-overlay-e761d76e7d192eec1d533f773fa97374b2a9afe07eddac794d23014a3065c410-merged.mount: Deactivated successfully.
Dec  5 07:16:44 np0005546909 podman[240694]: 2025-12-05 12:16:44.847352757 +0000 UTC m=+0.324291717 container cleanup 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:16:44 np0005546909 systemd[1]: libpod-conmon-0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14.scope: Deactivated successfully.
Dec  5 07:16:44 np0005546909 podman[240735]: 2025-12-05 12:16:44.9605547 +0000 UTC m=+0.093523165 container remove 0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.965 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7882fa2d-9b49-40e1-b4ac-09fdc604ceae]: (4, ('Fri Dec  5 12:16:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14)\n0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14\nFri Dec  5 12:16:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 (0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14)\n0db12242463cffc4ca807989257e1733641968c844631aa6392aeaf4e7d2bf14\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.967 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec64088-b19a-4f44-b4ec-b4241b629e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.968 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9ed41c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 kernel: tapf9ed41c2-b0: left promiscuous mode
Dec  5 07:16:44 np0005546909 nova_compute[187208]: 2025-12-05 12:16:44.984 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:44.987 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[987f076c-5bd1-454a-b806-339f43aab4a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.003 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4caf7e3-dee0-4c99-87f0-8b74c1cb8a89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.005 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3561387f-7c50-48d7-8d56-9f69b393288d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.027 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9dbaa5a0-5e75-461b-b213-7f144554b378]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438239, 'reachable_time': 15412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240748, 'error': None, 'target': 'ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9ed41c2-b085-41ff-ac71-6256a4e30e85 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a9a5633b-c884-44c2-98e7-96e3d699b034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.030 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:45 np0005546909 systemd[1]: run-netns-ovnmeta\x2df9ed41c2\x2db085\x2d41ff\x2dac71\x2d6256a4e30e85.mount: Deactivated successfully.
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.031 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.032 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b9cc72-535d-4bd1-96fd-fe7e68ee84e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.033 104471 INFO neutron.agent.ovn.metadata.agent [-] Port 5316adeb-5a49-4a58-b997-f132a083ff13 in datapath f9ed41c2-b085-41ff-ac71-6256a4e30e85 unbound from our chassis#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.034 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9ed41c2-b085-41ff-ac71-6256a4e30e85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:16:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:45.035 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[001e5d16-86cf-49a6-9189-0131c4483b8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.110 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG oslo_concurrency.lockutils [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.263 187212 DEBUG nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:46 np0005546909 nova_compute[187208]: 2025-12-05 12:16:46.264 187212 WARNING nova.compute.manager [req-54e376c3-01f7-41d3-b9a5-abf7e3d49e6f req-de9e1261-e38b-410b-a6b7-2479f65778cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.270 187212 DEBUG nova.network.neutron [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.303 187212 INFO nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Took 3.56 seconds to deallocate network for instance.#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.345 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.345 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.376 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.377 187212 DEBUG oslo_concurrency.lockutils [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 WARNING nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.378 187212 DEBUG nova.compute.manager [req-87e14b64-4e21-492f-bb46-3f32afc3b3c6 req-9fb50fdf-bf3b-42a2-b54a-d6ebeb36381a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-deleted-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.407 187212 DEBUG nova.compute.provider_tree [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.422 187212 DEBUG nova.scheduler.client.report [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.444 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.469 187212 INFO nova.scheduler.client.report [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Deleted allocations for instance 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.618 187212 DEBUG oslo_concurrency.lockutils [None req-141993e5-772e-415e-865e-c609246e2a45 41799f35c2764b25912247e2e8e2e9c5 75752a4cc8f7487e8dc4440201f894c8 - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:48 np0005546909 nova_compute[187208]: 2025-12-05 12:16:48.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:49 np0005546909 nova_compute[187208]: 2025-12-05 12:16:49.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:49 np0005546909 nova_compute[187208]: 2025-12-05 12:16:49.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:16:49 np0005546909 nova_compute[187208]: 2025-12-05 12:16:49.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:16:49 np0005546909 nova_compute[187208]: 2025-12-05 12:16:49.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:50 np0005546909 podman[240755]: 2025-12-05 12:16:50.213205623 +0000 UTC m=+0.058020119 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:16:50 np0005546909 podman[240754]: 2025-12-05 12:16:50.217476756 +0000 UTC m=+0.060628764 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:16:50 np0005546909 podman[240756]: 2025-12-05 12:16:50.261350855 +0000 UTC m=+0.098477298 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.635 187212 DEBUG nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.636 187212 DEBUG oslo_concurrency.lockutils [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.637 187212 DEBUG nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] No waiting events found dispatching network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:16:50 np0005546909 nova_compute[187208]: 2025-12-05 12:16:50.637 187212 WARNING nova.compute.manager [req-976206bf-a901-49ea-818c-cc10e6b26873 req-1719818a-b327-4fc1-be17-950130849875 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Received unexpected event network-vif-plugged-5316adeb-5a49-4a58-b997-f132a083ff13 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:16:51 np0005546909 nova_compute[187208]: 2025-12-05 12:16:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:51 np0005546909 nova_compute[187208]: 2025-12-05 12:16:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:52 np0005546909 nova_compute[187208]: 2025-12-05 12:16:52.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:53 np0005546909 nova_compute[187208]: 2025-12-05 12:16:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:53 np0005546909 nova_compute[187208]: 2025-12-05 12:16:53.843 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:54 np0005546909 nova_compute[187208]: 2025-12-05 12:16:54.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:54 np0005546909 nova_compute[187208]: 2025-12-05 12:16:54.682 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:55.472 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:16:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:16:55.473 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:16:55 np0005546909 nova_compute[187208]: 2025-12-05 12:16:55.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:55 np0005546909 nova_compute[187208]: 2025-12-05 12:16:55.718 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:55 np0005546909 nova_compute[187208]: 2025-12-05 12:16:55.841 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.084 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.085 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.085 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.264 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.266 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5501MB free_disk=73.04064559936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.267 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.267 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.378 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.396 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.422 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:16:56 np0005546909 nova_compute[187208]: 2025-12-05 12:16:56.422 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:16:58 np0005546909 nova_compute[187208]: 2025-12-05 12:16:58.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:16:59 np0005546909 nova_compute[187208]: 2025-12-05 12:16:59.658 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937004.656347, 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:16:59 np0005546909 nova_compute[187208]: 2025-12-05 12:16:59.658 187212 INFO nova.compute.manager [-] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:16:59 np0005546909 nova_compute[187208]: 2025-12-05 12:16:59.676 187212 DEBUG nova.compute.manager [None req-1c89c431-4828-46c7-b5ea-1a1384ec34e9 - - - - - -] [instance: 2ea56cbb-4f69-464c-b7f4-fecc5b9a1d4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:16:59 np0005546909 nova_compute[187208]: 2025-12-05 12:16:59.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:01 np0005546909 podman[240823]: 2025-12-05 12:17:01.197477618 +0000 UTC m=+0.055606639 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:17:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.021 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:03 np0005546909 nova_compute[187208]: 2025-12-05 12:17:03.847 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:04.475 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:04 np0005546909 nova_compute[187208]: 2025-12-05 12:17:04.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:06 np0005546909 podman[240847]: 2025-12-05 12:17:06.203157771 +0000 UTC m=+0.060023757 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:17:08 np0005546909 nova_compute[187208]: 2025-12-05 12:17:08.849 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:09 np0005546909 nova_compute[187208]: 2025-12-05 12:17:09.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:11 np0005546909 podman[240868]: 2025-12-05 12:17:11.200245065 +0000 UTC m=+0.058722309 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=edpm)
Dec  5 07:17:11 np0005546909 podman[240869]: 2025-12-05 12:17:11.217948657 +0000 UTC m=+0.067642507 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:17:13 np0005546909 nova_compute[187208]: 2025-12-05 12:17:13.852 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:14 np0005546909 nova_compute[187208]: 2025-12-05 12:17:14.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:18 np0005546909 nova_compute[187208]: 2025-12-05 12:17:18.854 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:19 np0005546909 nova_compute[187208]: 2025-12-05 12:17:19.693 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:21 np0005546909 podman[240909]: 2025-12-05 12:17:21.20091074 +0000 UTC m=+0.052679758 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd)
Dec  5 07:17:21 np0005546909 podman[240910]: 2025-12-05 12:17:21.209008882 +0000 UTC m=+0.055563251 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:17:21 np0005546909 podman[240911]: 2025-12-05 12:17:21.242782558 +0000 UTC m=+0.086186586 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:17:23 np0005546909 nova_compute[187208]: 2025-12-05 12:17:23.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:24 np0005546909 nova_compute[187208]: 2025-12-05 12:17:24.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:28 np0005546909 nova_compute[187208]: 2025-12-05 12:17:28.859 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:29 np0005546909 nova_compute[187208]: 2025-12-05 12:17:29.732 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:32 np0005546909 podman[240978]: 2025-12-05 12:17:32.201060147 +0000 UTC m=+0.053645026 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:17:33 np0005546909 nova_compute[187208]: 2025-12-05 12:17:33.861 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.356 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.356 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.377 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.453 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.454 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.462 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.463 187212 INFO nova.compute.claims [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.599 187212 DEBUG nova.compute.provider_tree [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.616 187212 DEBUG nova.scheduler.client.report [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.642 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.643 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.734 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.913 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.914 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.939 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:17:34 np0005546909 nova_compute[187208]: 2025-12-05 12:17:34.964 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.107 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.109 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.110 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating image(s)#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.112 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.113 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.114 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.129 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.195 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.196 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.197 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.213 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.275 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.276 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.312 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.313 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.313 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.370 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.372 187212 DEBUG nova.virt.disk.api [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Checking if we can resize image /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.372 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.447 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.449 187212 DEBUG nova.virt.disk.api [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Cannot resize image /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.449 187212 DEBUG nova.objects.instance [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'migration_context' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.464 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.465 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Ensure instance console log exists: /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.465 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.466 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:35 np0005546909 nova_compute[187208]: 2025-12-05 12:17:35.466 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:36 np0005546909 nova_compute[187208]: 2025-12-05 12:17:36.024 187212 DEBUG nova.policy [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9db950f394294957891a245f192c5404', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:17:37 np0005546909 podman[241018]: 2025-12-05 12:17:37.219567659 +0000 UTC m=+0.067613236 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:17:38 np0005546909 nova_compute[187208]: 2025-12-05 12:17:38.117 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Successfully created port: c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:17:38 np0005546909 nova_compute[187208]: 2025-12-05 12:17:38.896 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:39 np0005546909 nova_compute[187208]: 2025-12-05 12:17:39.737 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:40 np0005546909 nova_compute[187208]: 2025-12-05 12:17:40.876 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Successfully updated port: c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:17:40 np0005546909 nova_compute[187208]: 2025-12-05 12:17:40.893 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:17:40 np0005546909 nova_compute[187208]: 2025-12-05 12:17:40.893 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:17:40 np0005546909 nova_compute[187208]: 2025-12-05 12:17:40.894 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:17:41 np0005546909 nova_compute[187208]: 2025-12-05 12:17:41.060 187212 DEBUG nova.compute.manager [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-changed-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:17:41 np0005546909 nova_compute[187208]: 2025-12-05 12:17:41.061 187212 DEBUG nova.compute.manager [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Refreshing instance network info cache due to event network-changed-c3bc0e34-ce29-4ea4-b0cb-f46472e25593. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:17:41 np0005546909 nova_compute[187208]: 2025-12-05 12:17:41.061 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:17:41 np0005546909 nova_compute[187208]: 2025-12-05 12:17:41.203 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:17:42 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:42Z|01093|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  5 07:17:42 np0005546909 podman[241044]: 2025-12-05 12:17:42.195322156 +0000 UTC m=+0.048706914 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:17:42 np0005546909 podman[241043]: 2025-12-05 12:17:42.206006302 +0000 UTC m=+0.064480306 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350)
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.378 187212 DEBUG nova.network.neutron [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.398 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance network_info: |[{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.399 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Refreshing network info cache for port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.402 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start _get_guest_xml network_info=[{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.408 187212 WARNING nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.413 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.413 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.416 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.libvirt.host [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.417 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.418 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.419 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.420 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.420 187212 DEBUG nova.virt.hardware [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.424 187212 DEBUG nova.virt.libvirt.vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActionsTest-11442
55362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:35Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.424 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.425 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.425 187212 DEBUG nova.objects.instance [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'pci_devices' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.441 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <uuid>01eab75c-0be7-4ae5-8946-99edd40a7231</uuid>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <name>instance-00000064</name>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:name>tempest-VolumesActionsTest-instance-1871572941</nova:name>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:17:43</nova:creationTime>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:user uuid="9db950f394294957891a245f192c5404">tempest-VolumesActionsTest-1144255362-project-member</nova:user>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:project uuid="7d9f0915cbe24ecfae713c84ca158d2c">tempest-VolumesActionsTest-1144255362</nova:project>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        <nova:port uuid="c3bc0e34-ce29-4ea4-b0cb-f46472e25593">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="serial">01eab75c-0be7-4ae5-8946-99edd40a7231</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="uuid">01eab75c-0be7-4ae5-8946-99edd40a7231</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:7d:f3:92"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <target dev="tapc3bc0e34-ce"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/console.log" append="off"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:17:43 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:17:43 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:17:43 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:17:43 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Preparing to wait for external event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.443 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.444 187212 DEBUG nova.virt.libvirt.vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActions
Test-1144255362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:35Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.444 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.445 187212 DEBUG nova.network.os_vif_util [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.445 187212 DEBUG os_vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.446 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.447 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.450 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.450 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3bc0e34-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.451 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc3bc0e34-ce, col_values=(('external_ids', {'iface-id': 'c3bc0e34-ce29-4ea4-b0cb-f46472e25593', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:f3:92', 'vm-uuid': '01eab75c-0be7-4ae5-8946-99edd40a7231'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.452 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:43 np0005546909 NetworkManager[55691]: <info>  [1764937063.4539] manager: (tapc3bc0e34-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.455 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.459 187212 INFO os_vif [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce')#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.522 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] No VIF found with MAC fa:16:3e:7d:f3:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.523 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Using config drive#033[00m
Dec  5 07:17:43 np0005546909 nova_compute[187208]: 2025-12-05 12:17:43.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.403 187212 INFO nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Creating config drive at /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.408 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptf1cj36w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.537 187212 DEBUG oslo_concurrency.processutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptf1cj36w" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:44 np0005546909 kernel: tapc3bc0e34-ce: entered promiscuous mode
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.6135] manager: (tapc3bc0e34-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/414)
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:44Z|01094|binding|INFO|Claiming lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for this chassis.
Dec  5 07:17:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:44Z|01095|binding|INFO|c3bc0e34-ce29-4ea4-b0cb-f46472e25593: Claiming fa:16:3e:7d:f3:92 10.100.0.6
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.617 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.633 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f3:92 10.100.0.6'], port_security=['fa:16:3e:7d:f3:92 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01eab75c-0be7-4ae5-8946-99edd40a7231', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2eb3347a-3ab9-4f98-aee7-0fd84b6b9272', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c88129a4-38c8-4910-a8a3-2a71bf90c0f0, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c3bc0e34-ce29-4ea4-b0cb-f46472e25593) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.634 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 in datapath d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 bound to our chassis#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.636 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2#033[00m
Dec  5 07:17:44 np0005546909 systemd-udevd[241103]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.649 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8e230793-be19-4988-988d-80f6ce06f28c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.650 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd34ebc66-b1 in ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.6523] device (tapc3bc0e34-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.6529] device (tapc3bc0e34-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.652 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd34ebc66-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.652 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e98cda9c-b968-4d66-adb5-c95bd458305a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.656 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[69d8231a-433d-4074-93da-75aec0a3d044]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 systemd-machined[153543]: New machine qemu-125-instance-00000064.
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.668 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[994b177c-c35b-4c0a-8780-79cdb2c0baf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:44Z|01096|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 ovn-installed in OVS
Dec  5 07:17:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:44Z|01097|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 up in Southbound
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 systemd[1]: Started Virtual Machine qemu-125-instance-00000064.
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.691 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b7669bcf-3d10-4da4-be42-c5e3401b5c00]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.720 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[30e82bc6-d472-4b65-b155-361b33d90f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.7268] manager: (tapd34ebc66-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/415)
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.725 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb73988-df9e-404f-bc3a-4789bde9c300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 systemd-udevd[241108]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.755 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1238e3-550d-4b6c-9e1b-dab5776708f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.758 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[60c76786-7534-4f4e-b908-b2a0443f2e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.7793] device (tapd34ebc66-b0): carrier: link connected
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.785 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[f153683d-9afd-4b2e-99a2-e9e610b19334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.801 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bba6bb69-d8de-4e0d-b70b-5573897eb0c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd34ebc66-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:33:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444534, 'reachable_time': 29614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241139, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.824 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[663eb7f0-1993-4af7-9150-964564df9b91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:330f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444534, 'tstamp': 444534}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241144, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.843 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[051d1bc6-a247-45fa-8508-87b40321badd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd34ebc66-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:33:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444534, 'reachable_time': 29614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241147, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.888 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1870e749-0827-4e55-b8db-63c1b891a41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.898 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937064.8977919, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.899 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Started (Lifecycle Event)#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.924 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.929 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937064.8980498, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.929 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.947 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.952 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.969 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.969 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[58be1696-5cfe-45af-be38-560a9989a32e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.971 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd34ebc66-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.971 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.972 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd34ebc66-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.973 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 NetworkManager[55691]: <info>  [1764937064.9741] manager: (tapd34ebc66-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Dec  5 07:17:44 np0005546909 kernel: tapd34ebc66-b0: entered promiscuous mode
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.978 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd34ebc66-b0, col_values=(('external_ids', {'iface-id': '05d66759-c75e-464f-b667-95d69ddda49c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.979 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:17:44Z|01098|binding|INFO|Releasing lport 05d66759-c75e-464f-b667-95d69ddda49c from this chassis (sb_readonly=0)
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.982 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.983 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[acc57a29-9093-47ef-abaf-4a18ea6c17b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.983 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.pid.haproxy
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:17:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:44.984 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'env', 'PROCESS_TAG=haproxy-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:17:44 np0005546909 nova_compute[187208]: 2025-12-05 12:17:44.992 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:45 np0005546909 podman[241180]: 2025-12-05 12:17:45.365233504 +0000 UTC m=+0.057248379 container create ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  5 07:17:45 np0005546909 systemd[1]: Started libpod-conmon-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope.
Dec  5 07:17:45 np0005546909 podman[241180]: 2025-12-05 12:17:45.33119584 +0000 UTC m=+0.023210725 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:17:45 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:17:45 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2037bdf28dc96dbeeb4800ea4a3a93355da1c29dfccc04e7f21f81071e995ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:17:45 np0005546909 podman[241180]: 2025-12-05 12:17:45.460482849 +0000 UTC m=+0.152497744 container init ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec  5 07:17:45 np0005546909 podman[241180]: 2025-12-05 12:17:45.466734298 +0000 UTC m=+0.158749163 container start ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  5 07:17:45 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : New worker (241201) forked
Dec  5 07:17:45 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : Loading success.
Dec  5 07:17:47 np0005546909 nova_compute[187208]: 2025-12-05 12:17:47.166 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updated VIF entry in instance network info cache for port c3bc0e34-ce29-4ea4-b0cb-f46472e25593. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:17:47 np0005546909 nova_compute[187208]: 2025-12-05 12:17:47.167 187212 DEBUG nova.network.neutron [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:17:47 np0005546909 nova_compute[187208]: 2025-12-05 12:17:47.202 187212 DEBUG oslo_concurrency.lockutils [req-997268c3-af37-4a64-ae7e-cc13f9d347fa req-e1147f97-4116-4f38-9f7a-f7ca4a8adbb4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.270 187212 DEBUG nova.compute.manager [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.271 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG oslo_concurrency.lockutils [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.274 187212 DEBUG nova.compute.manager [req-7802abce-da55-49aa-902c-69b84a4b2414 req-b869fce2-8c56-4367-b587-e79264597163 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Processing event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.275 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.279 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937068.2794023, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.280 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.282 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.286 187212 INFO nova.virt.libvirt.driver [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance spawned successfully.#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.286 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.326 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.332 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.332 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.333 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.334 187212 DEBUG nova.virt.libvirt.driver [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.381 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.422 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.423 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.443 187212 INFO nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 13.33 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.444 187212 DEBUG nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.568 187212 INFO nova.compute.manager [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 14.15 seconds to build instance.#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.580 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.581 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.686 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.693 187212 DEBUG oslo_concurrency.lockutils [None req-2bb0437c-70a9-4c92-966b-1c2139c7aabb 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.789 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.790 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.816 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.817 187212 INFO nova.compute.claims [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:17:48 np0005546909 nova_compute[187208]: 2025-12-05 12:17:48.900 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.038 187212 DEBUG nova.compute.provider_tree [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.056 187212 DEBUG nova.scheduler.client.report [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.086 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.090 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.114 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.115 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.125 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "35c29d67-cc5d-4530-ab3c-1a002a162fdb" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.126 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.178 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.179 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.205 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.229 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.364 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.366 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.366 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating image(s)#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.367 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.368 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.369 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.381 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.381 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.382 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.382 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.384 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.447 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.448 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.449 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.461 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.519 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.520 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.894 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk 1073741824" returned: 0 in 0.374s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.894 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.895 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.937 187212 DEBUG nova.policy [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '698ee3761ad948dca92f44ac1749fd10', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58210cf112da477fa142779ffcbe2b11', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.965 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.968 187212 DEBUG nova.virt.disk.api [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Checking if we can resize image /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:17:49 np0005546909 nova_compute[187208]: 2025-12-05 12:17:49.970 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.045 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.047 187212 DEBUG nova.virt.disk.api [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Cannot resize image /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.049 187212 DEBUG nova.objects.instance [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'migration_context' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.706 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.706 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Ensure instance console log exists: /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:50 np0005546909 nova_compute[187208]: 2025-12-05 12:17:50.707 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:52 np0005546909 podman[241229]: 2025-12-05 12:17:52.212865514 +0000 UTC m=+0.061083539 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:17:52 np0005546909 podman[241228]: 2025-12-05 12:17:52.238783315 +0000 UTC m=+0.088056200 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:17:52 np0005546909 podman[241230]: 2025-12-05 12:17:52.298505954 +0000 UTC m=+0.110790030 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.336 187212 DEBUG nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.337 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.337 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 DEBUG oslo_concurrency.lockutils [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 DEBUG nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.338 187212 WARNING nova.compute.manager [req-50a4e79e-96d7-4f2b-b656-4c474df6781a req-ae9c15c5-7712-42cc-b29a-bf8a7b13cd36 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received unexpected event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.497 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:53 np0005546909 nova_compute[187208]: 2025-12-05 12:17:53.902 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.054 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [{"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.067 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Successfully created port: e59d2789-96ad-4740-8d45-d90c6b6f60ca _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-01eab75c-0be7-4ae5-8946-99edd40a7231" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.091 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.092 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:55.550 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:17:55 np0005546909 nova_compute[187208]: 2025-12-05 12:17:55.551 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:55.552 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.124 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.125 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.212 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.236 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Successfully updated port: e59d2789-96ad-4740-8d45-d90c6b6f60ca _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquired lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.254 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.275 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.275 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.341 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.473 187212 DEBUG nova.compute.manager [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-changed-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.474 187212 DEBUG nova.compute.manager [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Refreshing instance network info cache due to event network-changed-e59d2789-96ad-4740-8d45-d90c6b6f60ca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.474 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.490 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.491 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5471MB free_disk=73.03964614868164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.492 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:17:56 np0005546909 nova_compute[187208]: 2025-12-05 12:17:56.492 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.501 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:17:58.553 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.731 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 01eab75c-0be7-4ae5-8946-99edd40a7231 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance c8b0c32f-8175-42fc-834d-a65de5b28996 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.732 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.753 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.868 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.895 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.947 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:17:58 np0005546909 nova_compute[187208]: 2025-12-05 12:17:58.947 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:17:59 np0005546909 nova_compute[187208]: 2025-12-05 12:17:59.942 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:17:59 np0005546909 nova_compute[187208]: 2025-12-05 12:17:59.943 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.736 187212 DEBUG nova.network.neutron [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.907 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Releasing lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.907 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance network_info: |[{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.908 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.908 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Refreshing network info cache for port e59d2789-96ad-4740-8d45-d90c6b6f60ca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.911 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start _get_guest_xml network_info=[{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.915 187212 WARNING nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.919 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.919 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.924 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.925 187212 DEBUG nova.virt.libvirt.host [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.926 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.927 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.928 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.929 187212 DEBUG nova.virt.hardware [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.934 187212 DEBUG nova.virt.libvirt.vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305842052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:49Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.934 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.935 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.936 187212 DEBUG nova.objects.instance [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'pci_devices' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.952 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <uuid>c8b0c32f-8175-42fc-834d-a65de5b28996</uuid>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <name>instance-00000065</name>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerGroupTestJSON-server-113501820</nova:name>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:18:00</nova:creationTime>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:user uuid="698ee3761ad948dca92f44ac1749fd10">tempest-ServerGroupTestJSON-305842052-project-member</nova:user>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:project uuid="58210cf112da477fa142779ffcbe2b11">tempest-ServerGroupTestJSON-305842052</nova:project>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        <nova:port uuid="e59d2789-96ad-4740-8d45-d90c6b6f60ca">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="serial">c8b0c32f-8175-42fc-834d-a65de5b28996</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="uuid">c8b0c32f-8175-42fc-834d-a65de5b28996</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:ab:ce:a4"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <target dev="tape59d2789-96"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/console.log" append="off"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:18:00 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:18:00 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:18:00 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:18:00 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.953 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Preparing to wait for external event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.953 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.954 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.954 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.955 187212 DEBUG nova.virt.libvirt.vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305842052-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:17:49Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.955 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG nova.network.os_vif_util [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG os_vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.956 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.957 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.957 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.960 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.961 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape59d2789-96, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.961 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape59d2789-96, col_values=(('external_ids', {'iface-id': 'e59d2789-96ad-4740-8d45-d90c6b6f60ca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:ce:a4', 'vm-uuid': 'c8b0c32f-8175-42fc-834d-a65de5b28996'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.962 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:00 np0005546909 NetworkManager[55691]: <info>  [1764937080.9635] manager: (tape59d2789-96): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.965 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:00 np0005546909 nova_compute[187208]: 2025-12-05 12:18:00.971 187212 INFO os_vif [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96')#033[00m
Dec  5 07:18:01 np0005546909 nova_compute[187208]: 2025-12-05 12:18:01.039 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:01 np0005546909 nova_compute[187208]: 2025-12-05 12:18:01.040 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:01 np0005546909 nova_compute[187208]: 2025-12-05 12:18:01.041 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] No VIF found with MAC fa:16:3e:ab:ce:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:18:01 np0005546909 nova_compute[187208]: 2025-12-05 12:18:01.041 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Using config drive#033[00m
Dec  5 07:18:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:01Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:f3:92 10.100.0.6
Dec  5 07:18:01 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:01Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:f3:92 10.100.0.6
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.107 187212 INFO nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Creating config drive at /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config#033[00m
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.112 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgdzbqqj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.247 187212 DEBUG oslo_concurrency.processutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptgdzbqqj" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:02 np0005546909 kernel: tape59d2789-96: entered promiscuous mode
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.3238] manager: (tape59d2789-96): new Tun device (/org/freedesktop/NetworkManager/Devices/418)
Dec  5 07:18:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:02Z|01099|binding|INFO|Claiming lport e59d2789-96ad-4740-8d45-d90c6b6f60ca for this chassis.
Dec  5 07:18:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:02Z|01100|binding|INFO|e59d2789-96ad-4740-8d45-d90c6b6f60ca: Claiming fa:16:3e:ab:ce:a4 10.100.0.9
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.358 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ce:a4 10.100.0.9'], port_security=['fa:16:3e:ab:ce:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8b0c32f-8175-42fc-834d-a65de5b28996', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58210cf112da477fa142779ffcbe2b11', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a0c658e5-9568-4f6f-9218-9e1f4aa6f42f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51692341-ed5e-46b2-ae59-906d4f1865f2, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e59d2789-96ad-4740-8d45-d90c6b6f60ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.359 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e59d2789-96ad-4740-8d45-d90c6b6f60ca in datapath cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 bound to our chassis#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.376 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.390 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c73a7a-bd24-40ab-80c7-bfe3ad5bf51d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.391 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcf3ac8ba-01 in ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.394 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcf3ac8ba-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.394 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4a033502-84da-4a2f-a3d0-1cc962b1ff93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.395 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[520e1e23-07ac-4757-9881-283602144d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 systemd-udevd[241349]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:18:02 np0005546909 systemd-machined[153543]: New machine qemu-126-instance-00000065.
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.409 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[792ebb2d-c552-4f7e-835b-2915fefc25c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.4164] device (tape59d2789-96): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.415 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.4176] device (tape59d2789-96): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:18:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:02Z|01101|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca ovn-installed in OVS
Dec  5 07:18:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:02Z|01102|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca up in Southbound
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.421 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 systemd[1]: Started Virtual Machine qemu-126-instance-00000065.
Dec  5 07:18:02 np0005546909 podman[241330]: 2025-12-05 12:18:02.437282238 +0000 UTC m=+0.116073012 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.441 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[98142ce8-2da3-42cd-9de2-cdf608e9d863]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.474 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[abe76a51-cd7c-43db-a65a-f69e9df7ed97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[709e8302-2561-47a5-b9bc-90eb6b40ab4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.4827] manager: (tapcf3ac8ba-00): new Veth device (/org/freedesktop/NetworkManager/Devices/419)
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.513 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2975f109-ab01-4dff-9fba-21526504ee8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.516 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cee9a66e-081a-4ef2-82ab-4a44750c4d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.5392] device (tapcf3ac8ba-00): carrier: link connected
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.544 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[cb928103-19c8-49e4-aaef-2fa81e4c69bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.561 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[da8d8282-5cda-48b8-a027-20dbaa5bcb70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf3ac8ba-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:2b:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446310, 'reachable_time': 28332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241393, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.579 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2818cb8-4833-4919-869d-ccbd0a81e18f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:2b73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446310, 'tstamp': 446310}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241394, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.598 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[61090776-580f-4ac1-917d-6626f739b115]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcf3ac8ba-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:2b:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 302], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446310, 'reachable_time': 28332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241395, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.631 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8944fd0f-00db-467b-81c5-e1b0116b12b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.682 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e49b7e0d-a721-4164-9f2b-25240100ff58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.683 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf3ac8ba-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.684 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.684 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcf3ac8ba-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:02 np0005546909 NetworkManager[55691]: <info>  [1764937082.6873] manager: (tapcf3ac8ba-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/420)
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 kernel: tapcf3ac8ba-00: entered promiscuous mode
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.690 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcf3ac8ba-00, col_values=(('external_ids', {'iface-id': '23b92e92-fbf6-41c6-bdbf-c1326bdf4966'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:02Z|01103|binding|INFO|Releasing lport 23b92e92-fbf6-41c6-bdbf-c1326bdf4966 from this chassis (sb_readonly=0)
Dec  5 07:18:02 np0005546909 nova_compute[187208]: 2025-12-05 12:18:02.715 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.716 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.717 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[317b6a02-7863-467f-a2a9-064f5287305b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.718 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.pid.haproxy
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:18:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:02.718 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'env', 'PROCESS_TAG=haproxy-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:03 np0005546909 podman[241427]: 2025-12-05 12:18:03.115495071 +0000 UTC m=+0.108899637 container create 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:18:03 np0005546909 podman[241427]: 2025-12-05 12:18:03.031492888 +0000 UTC m=+0.024897474 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:18:03 np0005546909 systemd[1]: Started libpod-conmon-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope.
Dec  5 07:18:03 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:18:03 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ffdba3fb53dbf39c5a535cbfc3cbd3cb85b10a5a7c9e9d67175042707b1883c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.201 187212 DEBUG nova.compute.manager [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.202 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG oslo_concurrency.lockutils [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.203 187212 DEBUG nova.compute.manager [req-ee62a304-7d3c-4bac-8900-f69f1247ea04 req-9de6bdec-09ce-455f-96cb-a61e0fd1c41a 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Processing event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:18:03 np0005546909 podman[241427]: 2025-12-05 12:18:03.315294427 +0000 UTC m=+0.308699033 container init 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:18:03 np0005546909 podman[241427]: 2025-12-05 12:18:03.327584308 +0000 UTC m=+0.320988894 container start 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.328 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3279169, c8b0c32f-8175-42fc-834d-a65de5b28996 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.329 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Started (Lifecycle Event)#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.333 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.338 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.344 187212 INFO nova.virt.libvirt.driver [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance spawned successfully.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.345 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.354 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.358 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : New worker (241456) forked
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : Loading success.
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.372 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.373 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.374 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.374 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.375 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.375 187212 DEBUG nova.virt.libvirt.driver [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.385 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.386 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3282623, c8b0c32f-8175-42fc-834d-a65de5b28996 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.386 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.424 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.428 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937083.3368895, c8b0c32f-8175-42fc-834d-a65de5b28996 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.429 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.459 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.463 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.469 187212 INFO nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 14.10 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.469 187212 DEBUG nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.499 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.550 187212 INFO nova.compute.manager [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 14.79 seconds to build instance.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.568 187212 DEBUG oslo_concurrency.lockutils [None req-2b1d9469-7f01-4ce3-815b-29e2851d351a 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.674 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.676 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.677 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.678 187212 INFO nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Terminating instance#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.679 187212 DEBUG nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:18:03 np0005546909 kernel: tapc3bc0e34-ce (unregistering): left promiscuous mode
Dec  5 07:18:03 np0005546909 NetworkManager[55691]: <info>  [1764937083.6991] device (tapc3bc0e34-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:18:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:03Z|01104|binding|INFO|Releasing lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 from this chassis (sb_readonly=0)
Dec  5 07:18:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:03Z|01105|binding|INFO|Setting lport c3bc0e34-ce29-4ea4-b0cb-f46472e25593 down in Southbound
Dec  5 07:18:03 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:03Z|01106|binding|INFO|Removing iface tapc3bc0e34-ce ovn-installed in OVS
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.705 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.709 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.713 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:f3:92 10.100.0.6'], port_security=['fa:16:3e:7d:f3:92 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '01eab75c-0be7-4ae5-8946-99edd40a7231', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d9f0915cbe24ecfae713c84ca158d2c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2eb3347a-3ab9-4f98-aee7-0fd84b6b9272', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c88129a4-38c8-4910-a8a3-2a71bf90c0f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=c3bc0e34-ce29-4ea4-b0cb-f46472e25593) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.714 104471 INFO neutron.agent.ovn.metadata.agent [-] Port c3bc0e34-ce29-4ea4-b0cb-f46472e25593 in datapath d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 unbound from our chassis#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.716 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.718 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[de8f16d3-b36c-4de3-908d-234000a2ed6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:03.719 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 namespace which is not needed anymore#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.724 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 systemd[1]: machine-qemu\x2d125\x2dinstance\x2d00000064.scope: Deactivated successfully.
Dec  5 07:18:03 np0005546909 systemd[1]: machine-qemu\x2d125\x2dinstance\x2d00000064.scope: Consumed 12.163s CPU time.
Dec  5 07:18:03 np0005546909 systemd-machined[153543]: Machine qemu-125-instance-00000064 terminated.
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : haproxy version is 2.8.14-c23fe91
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [NOTICE]   (241199) : path to executable is /usr/sbin/haproxy
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [WARNING]  (241199) : Exiting Master process...
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [ALERT]    (241199) : Current worker (241201) exited with code 143 (Terminated)
Dec  5 07:18:03 np0005546909 neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2[241195]: [WARNING]  (241199) : All workers exited. Exiting... (0)
Dec  5 07:18:03 np0005546909 systemd[1]: libpod-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope: Deactivated successfully.
Dec  5 07:18:03 np0005546909 podman[241487]: 2025-12-05 12:18:03.878239452 +0000 UTC m=+0.059598216 container died ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:18:03 np0005546909 NetworkManager[55691]: <info>  [1764937083.9033] manager: (tapc3bc0e34-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/421)
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.903 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.938 187212 INFO nova.virt.libvirt.driver [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Instance destroyed successfully.#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.939 187212 DEBUG nova.objects.instance [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lazy-loading 'resources' on Instance uuid 01eab75c-0be7-4ae5-8946-99edd40a7231 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.952 187212 DEBUG nova.virt.libvirt.vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:17:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1871572941',display_name='tempest-VolumesActionsTest-instance-1871572941',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-1871572941',id=100,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:17:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d9f0915cbe24ecfae713c84ca158d2c',ramdisk_id='',reservation_id='r-wqaeiphu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1144255362',owner_user_name='tempest-VolumesActionsTest-1144255362-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:17:48Z,user_data=None,user_id='9db950f394294957891a245f192c5404',uuid=01eab75c-0be7-4ae5-8946-99edd40a7231,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.952 187212 DEBUG nova.network.os_vif_util [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converting VIF {"id": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "address": "fa:16:3e:7d:f3:92", "network": {"id": "d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1487460645-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7d9f0915cbe24ecfae713c84ca158d2c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3bc0e34-ce", "ovs_interfaceid": "c3bc0e34-ce29-4ea4-b0cb-f46472e25593", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.953 187212 DEBUG nova.network.os_vif_util [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.954 187212 DEBUG os_vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.958 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3bc0e34-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.959 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.961 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.963 187212 INFO os_vif [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:f3:92,bridge_name='br-int',has_traffic_filtering=True,id=c3bc0e34-ce29-4ea4-b0cb-f46472e25593,network=Network(d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3bc0e34-ce')#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.964 187212 INFO nova.virt.libvirt.driver [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deleting instance files /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231_del#033[00m
Dec  5 07:18:03 np0005546909 nova_compute[187208]: 2025-12-05 12:18:03.965 187212 INFO nova.virt.libvirt.driver [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deletion of /var/lib/nova/instances/01eab75c-0be7-4ae5-8946-99edd40a7231_del complete#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.029 187212 INFO nova.compute.manager [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.030 187212 DEBUG oslo.service.loopingcall [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.031 187212 DEBUG nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.031 187212 DEBUG nova.network.neutron [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:18:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344-userdata-shm.mount: Deactivated successfully.
Dec  5 07:18:04 np0005546909 systemd[1]: var-lib-containers-storage-overlay-e2037bdf28dc96dbeeb4800ea4a3a93355da1c29dfccc04e7f21f81071e995ca-merged.mount: Deactivated successfully.
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.215 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.216 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.217 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.217 187212 DEBUG oslo_concurrency.lockutils [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.218 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.218 187212 DEBUG nova.compute.manager [req-1b822cbd-2e4a-4411-9e8d-1bb9d6625681 req-a5a9f32d-2f3b-4ea4-ae67-3f847c4ec8d4 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-unplugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:18:04 np0005546909 podman[241487]: 2025-12-05 12:18:04.263000728 +0000 UTC m=+0.444359492 container cleanup ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 07:18:04 np0005546909 systemd[1]: libpod-conmon-ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344.scope: Deactivated successfully.
Dec  5 07:18:04 np0005546909 podman[241532]: 2025-12-05 12:18:04.333509496 +0000 UTC m=+0.045015879 container remove ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d1793643-7ed6-45d0-8351-7a6186603f21]: (4, ('Fri Dec  5 12:18:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 (ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344)\nca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344\nFri Dec  5 12:18:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 (ca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344)\nca45806e6a95592f67127078876281f97a063f81e992c831712b79d54c4ab344\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.340 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d595cde-a9cb-4652-9e8a-896a7bdc68e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd34ebc66-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.343 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:04 np0005546909 kernel: tapd34ebc66-b0: left promiscuous mode
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.345 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.348 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7ecef12c-36c7-4903-8321-046f0c5182b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.359 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.372 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[af550bba-89ab-46f4-92d6-75e7a2d5ab47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.375 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d45799b2-7a70-4e6c-8da2-d5b5dc856993]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.392 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[35cc81f2-4ab3-457d-bb49-791de23154ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444527, 'reachable_time': 29683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241548, 'error': None, 'target': 'ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.395 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d34ebc66-b7e3-4d6f-b6cd-b40947e8fed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:18:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:04.395 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b4411405-b1a5-4ff2-92fb-71793e9e44cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:04 np0005546909 systemd[1]: run-netns-ovnmeta\x2dd34ebc66\x2db7e3\x2d4d6f\x2db6cd\x2db40947e8fed2.mount: Deactivated successfully.
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.638 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updated VIF entry in instance network info cache for port e59d2789-96ad-4740-8d45-d90c6b6f60ca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.638 187212 DEBUG nova.network.neutron [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [{"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:04 np0005546909 nova_compute[187208]: 2025-12-05 12:18:04.655 187212 DEBUG oslo_concurrency.lockutils [req-1eb3b636-6ed3-4b67-8056-7ae995b70505 req-8fe1ef5a-fd10-4b44-86c3-61769ec1d302 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-c8b0c32f-8175-42fc-834d-a65de5b28996" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.327 187212 DEBUG nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.328 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.329 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.330 187212 DEBUG oslo_concurrency.lockutils [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.331 187212 DEBUG nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.331 187212 WARNING nova.compute.manager [req-1564ead4-3716-42f0-96da-670e825d07b9 req-e920fe83-9fa9-4f32-8825-5a2602ae0d09 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received unexpected event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with vm_state active and task_state None.#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.363 187212 DEBUG nova.network.neutron [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.396 187212 INFO nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Took 1.37 seconds to deallocate network for instance.#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.459 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.460 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.551 187212 DEBUG nova.compute.provider_tree [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.575 187212 DEBUG nova.scheduler.client.report [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.607 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.636 187212 INFO nova.scheduler.client.report [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Deleted allocations for instance 01eab75c-0be7-4ae5-8946-99edd40a7231#033[00m
Dec  5 07:18:05 np0005546909 nova_compute[187208]: 2025-12-05 12:18:05.753 187212 DEBUG oslo_concurrency.lockutils [None req-acc8b5d1-09e0-4673-9db9-5ddcdde960ee 9db950f394294957891a245f192c5404 7d9f0915cbe24ecfae713c84ca158d2c - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.674 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.675 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.676 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.676 187212 DEBUG oslo_concurrency.lockutils [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "01eab75c-0be7-4ae5-8946-99edd40a7231-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.677 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] No waiting events found dispatching network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.677 187212 WARNING nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received unexpected event network-vif-plugged-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.678 187212 DEBUG nova.compute.manager [req-39e69dfe-db43-4bc6-a7fe-fec3aa7c23a7 req-100a81e3-56e8-407d-b5fc-5c36c812dd95 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Received event network-vif-deleted-c3bc0e34-ce29-4ea4-b0cb-f46472e25593 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.714 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.714 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.715 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.715 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.716 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.717 187212 INFO nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Terminating instance#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.718 187212 DEBUG nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:18:06 np0005546909 kernel: tape59d2789-96 (unregistering): left promiscuous mode
Dec  5 07:18:06 np0005546909 NetworkManager[55691]: <info>  [1764937086.7358] device (tape59d2789-96): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.741 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:06Z|01107|binding|INFO|Releasing lport e59d2789-96ad-4740-8d45-d90c6b6f60ca from this chassis (sb_readonly=0)
Dec  5 07:18:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:06Z|01108|binding|INFO|Setting lport e59d2789-96ad-4740-8d45-d90c6b6f60ca down in Southbound
Dec  5 07:18:06 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:06Z|01109|binding|INFO|Removing iface tape59d2789-96 ovn-installed in OVS
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:06 np0005546909 nova_compute[187208]: 2025-12-05 12:18:06.755 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.752 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:ce:a4 10.100.0.9'], port_security=['fa:16:3e:ab:ce:a4 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'c8b0c32f-8175-42fc-834d-a65de5b28996', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58210cf112da477fa142779ffcbe2b11', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a0c658e5-9568-4f6f-9218-9e1f4aa6f42f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51692341-ed5e-46b2-ae59-906d4f1865f2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=e59d2789-96ad-4740-8d45-d90c6b6f60ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.754 104471 INFO neutron.agent.ovn.metadata.agent [-] Port e59d2789-96ad-4740-8d45-d90c6b6f60ca in datapath cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 unbound from our chassis#033[00m
Dec  5 07:18:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.756 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:18:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.757 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[94c510c5-31c9-4718-81e9-bb34471523ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.758 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 namespace which is not needed anymore#033[00m
Dec  5 07:18:06 np0005546909 systemd[1]: machine-qemu\x2d126\x2dinstance\x2d00000065.scope: Deactivated successfully.
Dec  5 07:18:06 np0005546909 systemd[1]: machine-qemu\x2d126\x2dinstance\x2d00000065.scope: Consumed 4.272s CPU time.
Dec  5 07:18:06 np0005546909 systemd-machined[153543]: Machine qemu-126-instance-00000065 terminated.
Dec  5 07:18:06 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : haproxy version is 2.8.14-c23fe91
Dec  5 07:18:06 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [NOTICE]   (241454) : path to executable is /usr/sbin/haproxy
Dec  5 07:18:06 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [WARNING]  (241454) : Exiting Master process...
Dec  5 07:18:06 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [ALERT]    (241454) : Current worker (241456) exited with code 143 (Terminated)
Dec  5 07:18:06 np0005546909 neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713[241443]: [WARNING]  (241454) : All workers exited. Exiting... (0)
Dec  5 07:18:06 np0005546909 systemd[1]: libpod-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope: Deactivated successfully.
Dec  5 07:18:06 np0005546909 podman[241574]: 2025-12-05 12:18:06.884550287 +0000 UTC m=+0.043644309 container died 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:18:06 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee-userdata-shm.mount: Deactivated successfully.
Dec  5 07:18:06 np0005546909 systemd[1]: var-lib-containers-storage-overlay-9ffdba3fb53dbf39c5a535cbfc3cbd3cb85b10a5a7c9e9d67175042707b1883c-merged.mount: Deactivated successfully.
Dec  5 07:18:06 np0005546909 podman[241574]: 2025-12-05 12:18:06.92834066 +0000 UTC m=+0.087434702 container cleanup 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:18:06 np0005546909 systemd[1]: libpod-conmon-53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee.scope: Deactivated successfully.
Dec  5 07:18:06 np0005546909 podman[241608]: 2025-12-05 12:18:06.994173874 +0000 UTC m=+0.042833837 container remove 53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:06.998 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d903fd4d-f9fc-48c2-b3e4-109cee1f5b3a]: (4, ('Fri Dec  5 12:18:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 (53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee)\n53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee\nFri Dec  5 12:18:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 (53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee)\n53b7a71f9dfc30b7a106541ebbf6bad06077dc292e07ec6d64c05ad815069eee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.001 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b2216e9d-0ef8-4140-91cd-2fe9a7c5039b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.002 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcf3ac8ba-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:07 np0005546909 kernel: tapcf3ac8ba-00: left promiscuous mode
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.009 187212 INFO nova.virt.libvirt.driver [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Instance destroyed successfully.#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.009 187212 DEBUG nova.objects.instance [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lazy-loading 'resources' on Instance uuid c8b0c32f-8175-42fc-834d-a65de5b28996 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.022 187212 DEBUG nova.virt.libvirt.vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:17:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerGroupTestJSON-server-113501820',display_name='tempest-ServerGroupTestJSON-server-113501820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servergrouptestjson-server-113501820',id=101,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:18:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='58210cf112da477fa142779ffcbe2b11',ramdisk_id='',reservation_id='r-xwaofgm0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerGroupTestJSON-305842052',owner_user_name='tempest-ServerGroupTestJSON-305842052-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:18:03Z,user_data=None,user_id='698ee3761ad948dca92f44ac1749fd10',uuid=c8b0c32f-8175-42fc-834d-a65de5b28996,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.023 187212 DEBUG nova.network.os_vif_util [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converting VIF {"id": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "address": "fa:16:3e:ab:ce:a4", "network": {"id": "cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-171813047-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "58210cf112da477fa142779ffcbe2b11", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape59d2789-96", "ovs_interfaceid": "e59d2789-96ad-4740-8d45-d90c6b6f60ca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.023 187212 DEBUG nova.network.os_vif_util [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.024 187212 DEBUG os_vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.024 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d4bfdc58-c502-43d3-9407-1dc4d92ab399]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.025 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape59d2789-96, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.026 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.028 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.034 187212 INFO os_vif [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:ce:a4,bridge_name='br-int',has_traffic_filtering=True,id=e59d2789-96ad-4740-8d45-d90c6b6f60ca,network=Network(cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape59d2789-96')#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.034 187212 INFO nova.virt.libvirt.driver [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deleting instance files /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996_del#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.035 187212 INFO nova.virt.libvirt.driver [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deletion of /var/lib/nova/instances/c8b0c32f-8175-42fc-834d-a65de5b28996_del complete#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.044 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1eeb59a5-79b7-41e2-a191-f69aad1d1aa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.045 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5aeb10-6d03-4124-b705-543b63630992]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.060 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[e6607f9a-588e-4444-b272-532f14b64b60]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446303, 'reachable_time': 34447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241640, 'error': None, 'target': 'ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 systemd[1]: run-netns-ovnmeta\x2dcf3ac8ba\x2d0b95\x2d4da0\x2d8a42\x2da4e8cffa2713.mount: Deactivated successfully.
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.063 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cf3ac8ba-0b95-4da0-8a42-a4e8cffa2713 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:18:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:07.063 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[86d8e60f-3477-41b4-80fa-6e596930d964]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.092 187212 INFO nova.compute.manager [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG oslo.service.loopingcall [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:18:07 np0005546909 nova_compute[187208]: 2025-12-05 12:18:07.093 187212 DEBUG nova.network.neutron [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:18:08 np0005546909 podman[241641]: 2025-12-05 12:18:08.206359741 +0000 UTC m=+0.055597781 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.450 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.451 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.451 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG oslo_concurrency.lockutils [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.452 187212 DEBUG nova.compute.manager [req-6d7a8143-434c-4120-9652-bc0caf605aba req-d485b0da-3609-4b70-b0cf-e4a9463b6895 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-unplugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.799 187212 DEBUG nova.network.neutron [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.821 187212 INFO nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Took 1.73 seconds to deallocate network for instance.#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.873 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.874 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.935 187212 DEBUG nova.compute.provider_tree [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.941 187212 DEBUG nova.compute.manager [req-e36d7745-5665-42da-bcc6-3e3ad9a5db06 req-84ff2602-f808-4994-b63c-db5662fcbd69 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-deleted-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.950 187212 DEBUG nova.scheduler.client.report [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:08 np0005546909 nova_compute[187208]: 2025-12-05 12:18:08.977 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:09 np0005546909 nova_compute[187208]: 2025-12-05 12:18:09.004 187212 INFO nova.scheduler.client.report [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Deleted allocations for instance c8b0c32f-8175-42fc-834d-a65de5b28996#033[00m
Dec  5 07:18:09 np0005546909 nova_compute[187208]: 2025-12-05 12:18:09.082 187212 DEBUG oslo_concurrency.lockutils [None req-3f4905f4-2b5a-468e-8666-abb361873714 698ee3761ad948dca92f44ac1749fd10 58210cf112da477fa142779ffcbe2b11 - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.834 187212 DEBUG nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.835 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.835 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 DEBUG oslo_concurrency.lockutils [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "c8b0c32f-8175-42fc-834d-a65de5b28996-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 DEBUG nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] No waiting events found dispatching network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:10 np0005546909 nova_compute[187208]: 2025-12-05 12:18:10.836 187212 WARNING nova.compute.manager [req-84073dca-a266-48e8-8711-7cd46fab7244 req-e27e773f-8a28-422d-a11f-7f0fa8b9477d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Received unexpected event network-vif-plugged-e59d2789-96ad-4740-8d45-d90c6b6f60ca for instance with vm_state deleted and task_state None.#033[00m
Dec  5 07:18:12 np0005546909 nova_compute[187208]: 2025-12-05 12:18:12.027 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:13 np0005546909 podman[241662]: 2025-12-05 12:18:13.209099052 +0000 UTC m=+0.054168921 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:18:13 np0005546909 podman[241661]: 2025-12-05 12:18:13.209150973 +0000 UTC m=+0.057337471 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  5 07:18:13 np0005546909 nova_compute[187208]: 2025-12-05 12:18:13.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.272 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.495 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.496 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.511 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.596 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.596 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.604 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.605 187212 INFO nova.compute.claims [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.758 187212 DEBUG nova.compute.provider_tree [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.779 187212 DEBUG nova.scheduler.client.report [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.798 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.799 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.839 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.839 187212 DEBUG nova.network.neutron [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.861 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.882 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.978 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.979 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.980 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating image(s)#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.980 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.981 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.982 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:16 np0005546909 nova_compute[187208]: 2025-12-05 12:18:16.998 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.029 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.063 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.064 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.065 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.081 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.144 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.145 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.196 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.197 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.198 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.254 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.255 187212 DEBUG nova.virt.disk.api [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Checking if we can resize image /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.256 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.319 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.321 187212 DEBUG nova.virt.disk.api [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Cannot resize image /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.321 187212 DEBUG nova.objects.instance [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'migration_context' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.336 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Ensure instance console log exists: /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.337 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:17 np0005546909 nova_compute[187208]: 2025-12-05 12:18:17.338 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.184 187212 DEBUG nova.network.neutron [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.185 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.187 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.193 187212 WARNING nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.198 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.199 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.203 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.203 187212 DEBUG nova.virt.libvirt.host [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.204 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.204 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.205 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.206 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.207 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.207 187212 DEBUG nova.virt.hardware [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.211 187212 DEBUG nova.objects.instance [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.224 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <uuid>ef9ada46-64bf-4990-954a-cc70a354b443</uuid>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <name>instance-00000066</name>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:name>tempest-VolumesNegativeTest-instance-577523945</nova:name>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:18:18</nova:creationTime>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:user uuid="50394847033f4123a02f592a98d13f9e">tempest-VolumesNegativeTest-893074152-project-member</nova:user>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:        <nova:project uuid="23d25e1d365b4bca9d6a6e954185bd66">tempest-VolumesNegativeTest-893074152</nova:project>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="serial">ef9ada46-64bf-4990-954a-cc70a354b443</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="uuid">ef9ada46-64bf-4990-954a-cc70a354b443</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/console.log" append="off"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:18:18 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:18:18 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:18:18 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:18:18 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.320 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.322 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.323 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Using config drive#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.607 187212 INFO nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Creating config drive at /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.611 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoatpvx3o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.742 187212 DEBUG oslo_concurrency.processutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoatpvx3o" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:18 np0005546909 systemd-machined[153543]: New machine qemu-127-instance-00000066.
Dec  5 07:18:18 np0005546909 systemd[1]: Started Virtual Machine qemu-127-instance-00000066.
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.937 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937083.9363487, 01eab75c-0be7-4ae5-8946-99edd40a7231 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.938 187212 INFO nova.compute.manager [-] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:18:18 np0005546909 nova_compute[187208]: 2025-12-05 12:18:18.973 187212 DEBUG nova.compute.manager [None req-3486363c-0e07-4f65-9c04-2dab29836616 - - - - - -] [instance: 01eab75c-0be7-4ae5-8946-99edd40a7231] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.300 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937099.299712, ef9ada46-64bf-4990-954a-cc70a354b443 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.300 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.302 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.302 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.308 187212 INFO nova.virt.libvirt.driver [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance spawned successfully.#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.308 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.332 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.338 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.342 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.342 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.343 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.344 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.344 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.345 187212 DEBUG nova.virt.libvirt.driver [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.378 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.378 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937099.3005576, ef9ada46-64bf-4990-954a-cc70a354b443 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.379 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Started (Lifecycle Event)#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.410 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.413 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.424 187212 INFO nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 2.45 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.425 187212 DEBUG nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.437 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.489 187212 INFO nova.compute.manager [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 2.93 seconds to build instance.#033[00m
Dec  5 07:18:19 np0005546909 nova_compute[187208]: 2025-12-05 12:18:19.508 187212 DEBUG oslo_concurrency.lockutils [None req-f89f05c2-3609-4013-b656-c275f8dc855f 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.142 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.143 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.143 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.144 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.144 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.145 187212 INFO nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Terminating instance#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquired lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.146 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:18:21 np0005546909 nova_compute[187208]: 2025-12-05 12:18:21.433 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.004 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937087.0036385, c8b0c32f-8175-42fc-834d-a65de5b28996 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.005 187212 INFO nova.compute.manager [-] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.031 187212 DEBUG nova.compute.manager [None req-852a9fe4-7c91-4a2b-bb15-6351b8d2d880 - - - - - -] [instance: c8b0c32f-8175-42fc-834d-a65de5b28996] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.032 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.109 187212 DEBUG nova.network.neutron [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.133 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Releasing lock "refresh_cache-ef9ada46-64bf-4990-954a-cc70a354b443" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.134 187212 DEBUG nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:18:22 np0005546909 systemd[1]: machine-qemu\x2d127\x2dinstance\x2d00000066.scope: Deactivated successfully.
Dec  5 07:18:22 np0005546909 systemd[1]: machine-qemu\x2d127\x2dinstance\x2d00000066.scope: Consumed 3.301s CPU time.
Dec  5 07:18:22 np0005546909 systemd-machined[153543]: Machine qemu-127-instance-00000066 terminated.
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.380 187212 INFO nova.virt.libvirt.driver [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance destroyed successfully.#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.381 187212 DEBUG nova.objects.instance [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lazy-loading 'resources' on Instance uuid ef9ada46-64bf-4990-954a-cc70a354b443 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.396 187212 INFO nova.virt.libvirt.driver [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deleting instance files /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443_del#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.397 187212 INFO nova.virt.libvirt.driver [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deletion of /var/lib/nova/instances/ef9ada46-64bf-4990-954a-cc70a354b443_del complete#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.459 187212 INFO nova.compute.manager [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.460 187212 DEBUG oslo.service.loopingcall [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.461 187212 DEBUG nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.461 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.789 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.803 187212 DEBUG nova.network.neutron [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.830 187212 INFO nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Took 0.37 seconds to deallocate network for instance.#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.876 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.877 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.936 187212 DEBUG nova.compute.provider_tree [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.956 187212 DEBUG nova.scheduler.client.report [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:22 np0005546909 nova_compute[187208]: 2025-12-05 12:18:22.999 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:23 np0005546909 nova_compute[187208]: 2025-12-05 12:18:23.030 187212 INFO nova.scheduler.client.report [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Deleted allocations for instance ef9ada46-64bf-4990-954a-cc70a354b443#033[00m
Dec  5 07:18:23 np0005546909 nova_compute[187208]: 2025-12-05 12:18:23.122 187212 DEBUG oslo_concurrency.lockutils [None req-a57897de-935c-4d63-84d3-787c8822fab4 50394847033f4123a02f592a98d13f9e 23d25e1d365b4bca9d6a6e954185bd66 - - default default] Lock "ef9ada46-64bf-4990-954a-cc70a354b443" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.979s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:23 np0005546909 podman[241754]: 2025-12-05 12:18:23.217063534 +0000 UTC m=+0.059762151 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:18:23 np0005546909 podman[241753]: 2025-12-05 12:18:23.2249657 +0000 UTC m=+0.069486709 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:18:23 np0005546909 podman[241755]: 2025-12-05 12:18:23.266933211 +0000 UTC m=+0.102977207 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:18:23 np0005546909 nova_compute[187208]: 2025-12-05 12:18:23.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:27 np0005546909 nova_compute[187208]: 2025-12-05 12:18:27.152 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:28 np0005546909 nova_compute[187208]: 2025-12-05 12:18:28.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:32 np0005546909 nova_compute[187208]: 2025-12-05 12:18:32.153 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:33 np0005546909 podman[241823]: 2025-12-05 12:18:33.203673754 +0000 UTC m=+0.052236725 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:18:33 np0005546909 nova_compute[187208]: 2025-12-05 12:18:33.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:18:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.155 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.379 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937102.3787365, ef9ada46-64bf-4990-954a-cc70a354b443 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.380 187212 INFO nova.compute.manager [-] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.400 187212 DEBUG nova.compute.manager [None req-3709be06-a359-4428-b633-cfb2b733b2c8 - - - - - -] [instance: ef9ada46-64bf-4990-954a-cc70a354b443] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.995 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:37 np0005546909 nova_compute[187208]: 2025-12-05 12:18:37.996 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.026 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.121 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.121 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.128 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.129 187212 INFO nova.compute.claims [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.235 187212 DEBUG nova.compute.provider_tree [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.257 187212 DEBUG nova.scheduler.client.report [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.301 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.302 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.362 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.363 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.384 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.401 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.535 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.536 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.536 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating image(s)#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.537 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.537 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.538 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.552 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.613 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.614 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.615 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.626 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.691 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.693 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.737 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.738 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.738 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.799 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.800 187212 DEBUG nova.virt.disk.api [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Checking if we can resize image /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.800 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.856 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.857 187212 DEBUG nova.virt.disk.api [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Cannot resize image /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.857 187212 DEBUG nova.objects.instance [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'migration_context' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.872 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.872 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Ensure instance console log exists: /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.873 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.873 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.874 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:38 np0005546909 nova_compute[187208]: 2025-12-05 12:18:38.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:39 np0005546909 podman[241862]: 2025-12-05 12:18:39.221933138 +0000 UTC m=+0.065293179 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Dec  5 07:18:39 np0005546909 nova_compute[187208]: 2025-12-05 12:18:39.564 187212 DEBUG nova.policy [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7907c905947f4a2290c1cb23fc23e453', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:18:40 np0005546909 nova_compute[187208]: 2025-12-05 12:18:40.736 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Successfully created port: a63fc129-9d70-44d9-a73a-cfa00b3264aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:18:41 np0005546909 nova_compute[187208]: 2025-12-05 12:18:41.985 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Successfully updated port: a63fc129-9d70-44d9-a73a-cfa00b3264aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.018 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.018 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.019 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.156 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.269 187212 DEBUG nova.compute.manager [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-changed-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.270 187212 DEBUG nova.compute.manager [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Refreshing instance network info cache due to event network-changed-a63fc129-9d70-44d9-a73a-cfa00b3264aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.270 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:18:42 np0005546909 nova_compute[187208]: 2025-12-05 12:18:42.408 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:18:43 np0005546909 nova_compute[187208]: 2025-12-05 12:18:43.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 podman[241883]: 2025-12-05 12:18:44.003277024 +0000 UTC m=+0.052640807 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  5 07:18:44 np0005546909 podman[241882]: 2025-12-05 12:18:44.003283344 +0000 UTC m=+0.058394671 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.026 187212 DEBUG nova.network.neutron [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.047 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance network_info: |[{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.048 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Refreshing network info cache for port a63fc129-9d70-44d9-a73a-cfa00b3264aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.051 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start _get_guest_xml network_info=[{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.056 187212 WARNING nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.060 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.060 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.063 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.063 187212 DEBUG nova.virt.libvirt.host [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.064 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.064 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.065 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.066 187212 DEBUG nova.virt.hardware [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.070 187212 DEBUG nova.virt.libvirt.vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:18:38Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.070 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.071 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.072 187212 DEBUG nova.objects.instance [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'pci_devices' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.086 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <uuid>b95b2427-7c9a-4d8d-bcfd-645393721cb5</uuid>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <name>instance-00000067</name>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerTagsTestJSON-server-257891300</nova:name>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:18:44</nova:creationTime>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:user uuid="7907c905947f4a2290c1cb23fc23e453">tempest-ServerTagsTestJSON-210151535-project-member</nova:user>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:project uuid="526f66a0e3ca44b097d8ce7f4a763497">tempest-ServerTagsTestJSON-210151535</nova:project>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        <nova:port uuid="a63fc129-9d70-44d9-a73a-cfa00b3264aa">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="serial">b95b2427-7c9a-4d8d-bcfd-645393721cb5</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="uuid">b95b2427-7c9a-4d8d-bcfd-645393721cb5</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:b1:b9:03"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <target dev="tapa63fc129-9d"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/console.log" append="off"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:18:44 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:18:44 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:18:44 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:18:44 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.087 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Preparing to wait for external event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.087 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.088 187212 DEBUG nova.virt.libvirt.vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:18:38Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.089 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.089 187212 DEBUG nova.network.os_vif_util [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG os_vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.090 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.091 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.093 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.093 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa63fc129-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.094 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa63fc129-9d, col_values=(('external_ids', {'iface-id': 'a63fc129-9d70-44d9-a73a-cfa00b3264aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:b9:03', 'vm-uuid': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.095 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 NetworkManager[55691]: <info>  [1764937124.0963] manager: (tapa63fc129-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/422)
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.097 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.105 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.106 187212 INFO os_vif [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d')#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.163 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] No VIF found with MAC fa:16:3e:b1:b9:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.164 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Using config drive#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.753 187212 INFO nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Creating config drive at /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.758 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_j2j4bh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.884 187212 DEBUG oslo_concurrency.processutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_j2j4bh" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:18:44 np0005546909 kernel: tapa63fc129-9d: entered promiscuous mode
Dec  5 07:18:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:44Z|01110|binding|INFO|Claiming lport a63fc129-9d70-44d9-a73a-cfa00b3264aa for this chassis.
Dec  5 07:18:44 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:44Z|01111|binding|INFO|a63fc129-9d70-44d9-a73a-cfa00b3264aa: Claiming fa:16:3e:b1:b9:03 10.100.0.5
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 NetworkManager[55691]: <info>  [1764937124.9728] manager: (tapa63fc129-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/423)
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 nova_compute[187208]: 2025-12-05 12:18:44.982 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.989 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:b9:03 10.100.0.5'], port_security=['fa:16:3e:b1:b9:03 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-331f2e14-0579-41c0-b551-4fc605c604b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'neutron:revision_number': '2', 'neutron:security_group_ids': '597979d2-7558-443c-8a4c-18c518fb0d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8c94da1-7797-46e4-848a-0d3e3ffcc75d, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a63fc129-9d70-44d9-a73a-cfa00b3264aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.990 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a63fc129-9d70-44d9-a73a-cfa00b3264aa in datapath 331f2e14-0579-41c0-b551-4fc605c604b5 bound to our chassis#033[00m
Dec  5 07:18:44 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:44.992 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 331f2e14-0579-41c0-b551-4fc605c604b5#033[00m
Dec  5 07:18:45 np0005546909 systemd-udevd[241942]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.004 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4ab0e64c-fe91-4978-90c7-c0fbd508f825]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.005 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap331f2e14-01 in ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.007 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap331f2e14-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.007 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cc0bac-4e78-41b7-863f-6f9bbc0fa82d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.008 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4d3afe-4669-44fe-8bdf-e963735c5512]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 NetworkManager[55691]: <info>  [1764937125.0150] device (tapa63fc129-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:18:45 np0005546909 NetworkManager[55691]: <info>  [1764937125.0157] device (tapa63fc129-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.021 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[02324057-ec33-4cd2-8554-e874ac64d0f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 systemd-machined[153543]: New machine qemu-128-instance-00000067.
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.035 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:45Z|01112|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa ovn-installed in OVS
Dec  5 07:18:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:45Z|01113|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa up in Southbound
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.036 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fac599a7-b428-4183-9873-fe87d47c8c0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.039 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:45 np0005546909 systemd[1]: Started Virtual Machine qemu-128-instance-00000067.
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.066 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b798a01c-0850-4aa2-85b7-903cf3a35c8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 systemd-udevd[241946]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:18:45 np0005546909 NetworkManager[55691]: <info>  [1764937125.0740] manager: (tap331f2e14-00): new Veth device (/org/freedesktop/NetworkManager/Devices/424)
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ff1c301a-ede3-4385-82c4-55df9e0dddb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.105 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[b98f9753-54ca-49d0-967c-d08779673172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.109 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0b29f9-da8e-4531-a1ed-c1642a311eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 NetworkManager[55691]: <info>  [1764937125.1382] device (tap331f2e14-00): carrier: link connected
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.145 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a91cf2-db59-4237-998f-1393c9519edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.164 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ab39db21-834e-4b3a-ac10-c6fbd52d841b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap331f2e14-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450570, 'reachable_time': 19663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241975, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.184 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[972c96bb-3090-4ef0-a313-431cf156a045]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:624a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450570, 'tstamp': 450570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241981, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.205 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5aad5bb7-2b84-4cf0-ba5d-157415ca4850]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap331f2e14-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:62:4a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450570, 'reachable_time': 19663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241983, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.240 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec44292-dfa0-45a2-a7cc-5f92fd37ac77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.257 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937125.2569408, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.258 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Started (Lifecycle Event)#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.279 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.283 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937125.2570643, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.283 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.300 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b166de56-450c-41b2-b64b-2cc52c384a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.301 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap331f2e14-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.302 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap331f2e14-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:45 np0005546909 kernel: tap331f2e14-00: entered promiscuous mode
Dec  5 07:18:45 np0005546909 NetworkManager[55691]: <info>  [1764937125.3049] manager: (tap331f2e14-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.306 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.306 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap331f2e14-00, col_values=(('external_ids', {'iface-id': 'e7daf13b-4028-44c8-88f6-67c36a959e89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.307 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:45 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:45Z|01114|binding|INFO|Releasing lport e7daf13b-4028-44c8-88f6-67c36a959e89 from this chassis (sb_readonly=0)
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.309 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.310 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[99827112-0c68-4105-9849-7f2b5978ee32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.311 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-331f2e14-0579-41c0-b551-4fc605c604b5
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/331f2e14-0579-41c0-b551-4fc605c604b5.pid.haproxy
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 331f2e14-0579-41c0-b551-4fc605c604b5
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.312 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:45 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:45.312 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'env', 'PROCESS_TAG=haproxy-331f2e14-0579-41c0-b551-4fc605c604b5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/331f2e14-0579-41c0-b551-4fc605c604b5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.336 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:45 np0005546909 podman[242016]: 2025-12-05 12:18:45.699566443 +0000 UTC m=+0.054885531 container create 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 07:18:45 np0005546909 systemd[1]: Started libpod-conmon-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope.
Dec  5 07:18:45 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:18:45 np0005546909 podman[242016]: 2025-12-05 12:18:45.671120689 +0000 UTC m=+0.026439807 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:18:45 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78c6e2eb3a9f74cb7fddf3d8d585a32734ee1d05235538c68add3f2e68661886/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.826 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updated VIF entry in instance network info cache for port a63fc129-9d70-44d9-a73a-cfa00b3264aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.827 187212 DEBUG nova.network.neutron [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:45 np0005546909 podman[242016]: 2025-12-05 12:18:45.844399867 +0000 UTC m=+0.199718985 container init 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:18:45 np0005546909 nova_compute[187208]: 2025-12-05 12:18:45.844 187212 DEBUG oslo_concurrency.lockutils [req-7fe4b063-54f3-45cc-91b0-0aae4d636887 req-65459bf9-5da2-4e19-9fc8-8be784821b7d 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:45 np0005546909 podman[242016]: 2025-12-05 12:18:45.849827732 +0000 UTC m=+0.205146820 container start 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:18:45 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : New worker (242037) forked
Dec  5 07:18:45 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : Loading success.
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.987 187212 DEBUG nova.compute.manager [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.987 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG oslo_concurrency.lockutils [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.988 187212 DEBUG nova.compute.manager [req-ef5d98b4-8720-4af0-b77a-dddc19511bdd req-b3185e64-e709-467c-81bf-2110314b4493 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Processing event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.989 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.992 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937127.9922357, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.992 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.994 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.996 187212 INFO nova.virt.libvirt.driver [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance spawned successfully.#033[00m
Dec  5 07:18:47 np0005546909 nova_compute[187208]: 2025-12-05 12:18:47.996 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.017 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.022 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.023 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.023 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.024 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.024 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.025 187212 DEBUG nova.virt.libvirt.driver [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.030 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.069 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.094 187212 INFO nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 9.56 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.094 187212 DEBUG nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.161 187212 INFO nova.compute.manager [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 10.07 seconds to build instance.#033[00m
Dec  5 07:18:48 np0005546909 nova_compute[187208]: 2025-12-05 12:18:48.176 187212 DEBUG oslo_concurrency.lockutils [None req-da6e3ad1-10cb-49c0-a992-c4fad9c8ee70 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:49 np0005546909 nova_compute[187208]: 2025-12-05 12:18:49.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:49 np0005546909 nova_compute[187208]: 2025-12-05 12:18:49.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:18:49 np0005546909 nova_compute[187208]: 2025-12-05 12:18:49.493 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.080 187212 DEBUG oslo_concurrency.lockutils [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.081 187212 DEBUG nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] No waiting events found dispatching network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.081 187212 WARNING nova.compute.manager [req-b6d4ce43-be60-47d8-a687-370935255262 req-5b710ae1-f1f4-421c-a491-ea8891416791 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received unexpected event network-vif-plugged-a63fc129-9d70-44d9-a73a-cfa00b3264aa for instance with vm_state active and task_state None.#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.611 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquired lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  5 07:18:50 np0005546909 nova_compute[187208]: 2025-12-05 12:18:50.612 187212 DEBUG nova.objects.instance [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.844 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.845 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.846 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.847 187212 INFO nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Terminating instance#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.848 187212 DEBUG nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:18:52 np0005546909 kernel: tapa63fc129-9d (unregistering): left promiscuous mode
Dec  5 07:18:52 np0005546909 NetworkManager[55691]: <info>  [1764937132.8883] device (tapa63fc129-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:18:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:52Z|01115|binding|INFO|Releasing lport a63fc129-9d70-44d9-a73a-cfa00b3264aa from this chassis (sb_readonly=0)
Dec  5 07:18:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:52Z|01116|binding|INFO|Setting lport a63fc129-9d70-44d9-a73a-cfa00b3264aa down in Southbound
Dec  5 07:18:52 np0005546909 ovn_controller[95610]: 2025-12-05T12:18:52Z|01117|binding|INFO|Removing iface tapa63fc129-9d ovn-installed in OVS
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.897 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.910 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:b9:03 10.100.0.5'], port_security=['fa:16:3e:b1:b9:03 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b95b2427-7c9a-4d8d-bcfd-645393721cb5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-331f2e14-0579-41c0-b551-4fc605c604b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '526f66a0e3ca44b097d8ce7f4a763497', 'neutron:revision_number': '4', 'neutron:security_group_ids': '597979d2-7558-443c-8a4c-18c518fb0d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8c94da1-7797-46e4-848a-0d3e3ffcc75d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=a63fc129-9d70-44d9-a73a-cfa00b3264aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.911 104471 INFO neutron.agent.ovn.metadata.agent [-] Port a63fc129-9d70-44d9-a73a-cfa00b3264aa in datapath 331f2e14-0579-41c0-b551-4fc605c604b5 unbound from our chassis#033[00m
Dec  5 07:18:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.912 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 331f2e14-0579-41c0-b551-4fc605c604b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:18:52 np0005546909 nova_compute[187208]: 2025-12-05 12:18:52.914 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.914 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[859331b6-08c9-4317-8b65-e8ba02c2a8d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:52 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:52.915 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 namespace which is not needed anymore#033[00m
Dec  5 07:18:52 np0005546909 systemd[1]: machine-qemu\x2d128\x2dinstance\x2d00000067.scope: Deactivated successfully.
Dec  5 07:18:52 np0005546909 systemd[1]: machine-qemu\x2d128\x2dinstance\x2d00000067.scope: Consumed 4.043s CPU time.
Dec  5 07:18:52 np0005546909 systemd-machined[153543]: Machine qemu-128-instance-00000067 terminated.
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.137 187212 INFO nova.virt.libvirt.driver [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Instance destroyed successfully.#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.138 187212 DEBUG nova.objects.instance [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lazy-loading 'resources' on Instance uuid b95b2427-7c9a-4d8d-bcfd-645393721cb5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.154 187212 DEBUG nova.virt.libvirt.vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:18:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-257891300',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-servertagstestjson-server-257891300',id=103,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:18:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='526f66a0e3ca44b097d8ce7f4a763497',ramdisk_id='',reservation_id='r-glcipxwz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='t
empest-ServerTagsTestJSON-210151535',owner_user_name='tempest-ServerTagsTestJSON-210151535-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:18:48Z,user_data=None,user_id='7907c905947f4a2290c1cb23fc23e453',uuid=b95b2427-7c9a-4d8d-bcfd-645393721cb5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.154 187212 DEBUG nova.network.os_vif_util [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converting VIF {"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.155 187212 DEBUG nova.network.os_vif_util [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.155 187212 DEBUG os_vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.157 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.157 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa63fc129-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.159 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.163 187212 INFO os_vif [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:b9:03,bridge_name='br-int',has_traffic_filtering=True,id=a63fc129-9d70-44d9-a73a-cfa00b3264aa,network=Network(331f2e14-0579-41c0-b551-4fc605c604b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa63fc129-9d')#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.164 187212 INFO nova.virt.libvirt.driver [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deleting instance files /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5_del#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.165 187212 INFO nova.virt.libvirt.driver [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deletion of /var/lib/nova/instances/b95b2427-7c9a-4d8d-bcfd-645393721cb5_del complete#033[00m
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : haproxy version is 2.8.14-c23fe91
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [NOTICE]   (242035) : path to executable is /usr/sbin/haproxy
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : Exiting Master process...
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : Exiting Master process...
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [ALERT]    (242035) : Current worker (242037) exited with code 143 (Terminated)
Dec  5 07:18:53 np0005546909 neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5[242031]: [WARNING]  (242035) : All workers exited. Exiting... (0)
Dec  5 07:18:53 np0005546909 systemd[1]: libpod-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope: Deactivated successfully.
Dec  5 07:18:53 np0005546909 podman[242068]: 2025-12-05 12:18:53.195958833 +0000 UTC m=+0.172315751 container died 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:18:53 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5-userdata-shm.mount: Deactivated successfully.
Dec  5 07:18:53 np0005546909 systemd[1]: var-lib-containers-storage-overlay-78c6e2eb3a9f74cb7fddf3d8d585a32734ee1d05235538c68add3f2e68661886-merged.mount: Deactivated successfully.
Dec  5 07:18:53 np0005546909 podman[242068]: 2025-12-05 12:18:53.34195999 +0000 UTC m=+0.318316898 container cleanup 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:18:53 np0005546909 systemd[1]: libpod-conmon-194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5.scope: Deactivated successfully.
Dec  5 07:18:53 np0005546909 podman[242114]: 2025-12-05 12:18:53.376752695 +0000 UTC m=+0.087835744 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.395 187212 INFO nova.compute.manager [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.395 187212 DEBUG oslo.service.loopingcall [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.396 187212 DEBUG nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.396 187212 DEBUG nova.network.neutron [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:18:53 np0005546909 podman[242149]: 2025-12-05 12:18:53.407418193 +0000 UTC m=+0.045130823 container remove 194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.439 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[34dabdb1-05a4-4f5c-8efd-af88c45b5acb]: (4, ('Fri Dec  5 12:18:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 (194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5)\n194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5\nFri Dec  5 12:18:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 (194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5)\n194a0a9d0562726dedc69f62cc352489a2bc354ddc46e6486ed7f624432c13c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 podman[242115]: 2025-12-05 12:18:53.44053701 +0000 UTC m=+0.151166416 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.442 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[415252b6-18bd-44c4-86a4-cde2dea481a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.443 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap331f2e14-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.445 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:53 np0005546909 kernel: tap331f2e14-00: left promiscuous mode
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.458 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.460 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[504d67e5-d4f0-476d-bd4c-cf5aad41ed7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.475 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b31df3b6-40d1-4524-aae1-995c17f5c6c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[eecead2d-5e11-434f-8c2d-dcd3e1b90e63]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 podman[242116]: 2025-12-05 12:18:53.490358045 +0000 UTC m=+0.196583985 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.496 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0997a0d8-32cd-4a7c-a98c-5bfd1483bcde]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450562, 'reachable_time': 35861, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242194, 'error': None, 'target': 'ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 systemd[1]: run-netns-ovnmeta\x2d331f2e14\x2d0579\x2d41c0\x2db551\x2d4fc605c604b5.mount: Deactivated successfully.
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.500 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-331f2e14-0579-41c0-b551-4fc605c604b5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:18:53 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:53.500 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[271b5dba-f5db-473d-b022-efa6a0bcd72e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.734 187212 DEBUG nova.network.neutron [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [{"id": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "address": "fa:16:3e:b1:b9:03", "network": {"id": "331f2e14-0579-41c0-b551-4fc605c604b5", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-502649288-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "526f66a0e3ca44b097d8ce7f4a763497", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa63fc129-9d", "ovs_interfaceid": "a63fc129-9d70-44d9-a73a-cfa00b3264aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Releasing lock "refresh_cache-b95b2427-7c9a-4d8d-bcfd-645393721cb5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.775 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.776 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:18:53 np0005546909 nova_compute[187208]: 2025-12-05 12:18:53.794 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.095 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.502 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.601 187212 DEBUG nova.network.neutron [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.623 187212 INFO nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Took 1.23 seconds to deallocate network for instance.#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.663 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.663 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:54 np0005546909 nova_compute[187208]: 2025-12-05 12:18:54.830 187212 DEBUG nova.compute.manager [req-2e25c503-3b93-4b22-80ff-87e784509782 req-5de10c4e-f966-49bc-910b-1057f412a41e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Received event network-vif-deleted-a63fc129-9d70-44d9-a73a-cfa00b3264aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.094 187212 DEBUG nova.compute.provider_tree [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.112 187212 DEBUG nova.scheduler.client.report [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.143 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.322 187212 INFO nova.scheduler.client.report [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Deleted allocations for instance b95b2427-7c9a-4d8d-bcfd-645393721cb5#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.402 187212 DEBUG oslo_concurrency.lockutils [None req-253a0f0c-2bc7-441e-9d34-3d8bc8ffc641 7907c905947f4a2290c1cb23fc23e453 526f66a0e3ca44b097d8ce7f4a763497 - - default default] Lock "b95b2427-7c9a-4d8d-bcfd-645393721cb5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:55.956 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:18:55 np0005546909 nova_compute[187208]: 2025-12-05 12:18:55.957 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:55 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:18:55.958 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.086 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.263 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=73.04059600830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.264 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.316 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.317 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.338 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.350 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.384 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:18:56 np0005546909 nova_compute[187208]: 2025-12-05 12:18:56.384 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:18:58 np0005546909 nova_compute[187208]: 2025-12-05 12:18:58.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:18:59 np0005546909 nova_compute[187208]: 2025-12-05 12:18:59.380 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:18:59 np0005546909 nova_compute[187208]: 2025-12-05 12:18:59.504 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:00 np0005546909 nova_compute[187208]: 2025-12-05 12:19:00.107 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.022 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:03 np0005546909 nova_compute[187208]: 2025-12-05 12:19:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:03 np0005546909 nova_compute[187208]: 2025-12-05 12:19:03.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:19:03 np0005546909 nova_compute[187208]: 2025-12-05 12:19:03.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:19:03 np0005546909 nova_compute[187208]: 2025-12-05 12:19:03.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:04 np0005546909 podman[242196]: 2025-12-05 12:19:04.219746636 +0000 UTC m=+0.057544547 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:19:04 np0005546909 nova_compute[187208]: 2025-12-05 12:19:04.507 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:05 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:05.960 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.134 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937133.1328614, b95b2427-7c9a-4d8d-bcfd-645393721cb5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.135 187212 INFO nova.compute.manager [-] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.163 187212 DEBUG nova.compute.manager [None req-b53d5033-0024-418b-86ff-6195a98baee9 - - - - - -] [instance: b95b2427-7c9a-4d8d-bcfd-645393721cb5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.211 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.768 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.769 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.788 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.879 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.880 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.888 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.888 187212 INFO nova.compute.claims [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.962 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.988 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec  5 07:19:08 np0005546909 nova_compute[187208]: 2025-12-05 12:19:08.988 187212 DEBUG nova.compute.provider_tree [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.006 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.034 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.075 187212 DEBUG nova.compute.provider_tree [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.090 187212 DEBUG nova.scheduler.client.report [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.116 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.116 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.168 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.186 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.205 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.290 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.292 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.293 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating image(s)
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.293 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.294 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.294 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.306 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.376 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.377 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.378 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.391 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.452 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.453 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.491 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.492 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.492 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.514 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.553 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.554 187212 DEBUG nova.virt.disk.api [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.554 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.615 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.616 187212 DEBUG nova.virt.disk.api [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.616 187212 DEBUG nova.objects.instance [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.631 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.631 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Ensure instance console log exists: /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.632 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.632 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.633 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.635 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.640 187212 WARNING nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.647 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.648 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.653 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.653 187212 DEBUG nova.virt.libvirt.host [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.654 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.654 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.655 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.656 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.656 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.657 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.658 187212 DEBUG nova.virt.hardware [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.664 187212 DEBUG nova.objects.instance [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.687 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <uuid>06e1cdc7-fc0d-4de0-baed-0876536b7ee1</uuid>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <name>instance-00000068</name>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerShowV247Test-server-1392417467</nova:name>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:19:09</nova:creationTime>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:        <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="serial">06e1cdc7-fc0d-4de0-baed-0876536b7ee1</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="uuid">06e1cdc7-fc0d-4de0-baed-0876536b7ee1</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/console.log" append="off"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:19:09 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:19:09 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:19:09 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:19:09 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.729 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.729 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:09 np0005546909 nova_compute[187208]: 2025-12-05 12:19:09.730 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Using config drive#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.193 187212 INFO nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Creating config drive at /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.198 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8upwpe3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:10 np0005546909 podman[242235]: 2025-12-05 12:19:10.20738144 +0000 UTC m=+0.058750182 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.332 187212 DEBUG oslo_concurrency.processutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp8upwpe3" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:10 np0005546909 systemd-machined[153543]: New machine qemu-129-instance-00000068.
Dec  5 07:19:10 np0005546909 systemd[1]: Started Virtual Machine qemu-129-instance-00000068.
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.727 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937150.7270656, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.727 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.730 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.731 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.734 187212 INFO nova.virt.libvirt.driver [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance spawned successfully.#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.734 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.753 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.758 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.762 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.763 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.764 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.764 187212 DEBUG nova.virt.libvirt.driver [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.793 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.794 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937150.7280824, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.794 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Started (Lifecycle Event)#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.822 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.826 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.842 187212 INFO nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 1.55 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.842 187212 DEBUG nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.848 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.856 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.856 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.883 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.916 187212 INFO nova.compute.manager [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 2.07 seconds to build instance.#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.939 187212 DEBUG oslo_concurrency.lockutils [None req-84e01d86-f095-4383-9b07-2aaec59e9e65 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.951 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.951 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.962 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:19:10 np0005546909 nova_compute[187208]: 2025-12-05 12:19:10.963 187212 INFO nova.compute.claims [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.077 187212 DEBUG nova.compute.provider_tree [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.093 187212 DEBUG nova.scheduler.client.report [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.114 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.115 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.160 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.174 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.196 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.272 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.273 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.274 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating image(s)#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.274 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.275 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.276 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.292 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.361 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.362 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.363 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.375 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.438 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.440 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.475 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.477 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.477 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.535 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.536 187212 DEBUG nova.virt.disk.api [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.536 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.594 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.595 187212 DEBUG nova.virt.disk.api [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.596 187212 DEBUG nova.objects.instance [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.610 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.610 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ensure instance console log exists: /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.611 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.611 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.612 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.614 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.617 187212 WARNING nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.622 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.622 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.625 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.626 187212 DEBUG nova.virt.libvirt.host [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.626 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.627 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.627 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.628 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.628 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.629 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.630 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.631 187212 DEBUG nova.virt.hardware [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.636 187212 DEBUG nova.objects.instance [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.653 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <uuid>4cdf5703-a103-4583-9e40-a33e86b5bf04</uuid>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <name>instance-00000069</name>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerShowV247Test-server-538809939</nova:name>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:19:11</nova:creationTime>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:        <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="serial">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="uuid">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log" append="off"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:19:11 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:19:11 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:19:11 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:19:11 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.721 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.721 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:11 np0005546909 nova_compute[187208]: 2025-12-05 12:19:11.722 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Using config drive#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.444 187212 INFO nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating config drive at /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.451 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuedvgwse execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.578 187212 DEBUG oslo_concurrency.processutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuedvgwse" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:12 np0005546909 systemd-machined[153543]: New machine qemu-130-instance-00000069.
Dec  5 07:19:12 np0005546909 systemd[1]: Started Virtual Machine qemu-130-instance-00000069.
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.903 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937152.9027894, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.903 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.906 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.906 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.910 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance spawned successfully.#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.911 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.929 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.934 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.941 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.942 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.943 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.943 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.944 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.944 187212 DEBUG nova.virt.libvirt.driver [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.979 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.979 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937152.9029043, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:12 np0005546909 nova_compute[187208]: 2025-12-05 12:19:12.980 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Started (Lifecycle Event)#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.009 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.012 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.019 187212 INFO nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 1.75 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.019 187212 DEBUG nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.030 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.072 187212 INFO nova.compute.manager [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 2.14 seconds to build instance.#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.094 187212 DEBUG oslo_concurrency.lockutils [None req-0d0b474f-7cd7-454e-b1da-d815861d5128 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:13 np0005546909 nova_compute[187208]: 2025-12-05 12:19:13.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:14 np0005546909 podman[242326]: 2025-12-05 12:19:14.203445494 +0000 UTC m=+0.055766137 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  5 07:19:14 np0005546909 podman[242325]: 2025-12-05 12:19:14.210099334 +0000 UTC m=+0.064151676 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm)
Dec  5 07:19:14 np0005546909 nova_compute[187208]: 2025-12-05 12:19:14.510 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.199 187212 INFO nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Rebuilding instance#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.530 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.561 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.636 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_requests' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.651 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.666 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.680 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'migration_context' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.695 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  5 07:19:15 np0005546909 nova_compute[187208]: 2025-12-05 12:19:15.700 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  5 07:19:18 np0005546909 nova_compute[187208]: 2025-12-05 12:19:18.217 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:19 np0005546909 nova_compute[187208]: 2025-12-05 12:19:19.512 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:23 np0005546909 nova_compute[187208]: 2025-12-05 12:19:23.221 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:24 np0005546909 podman[242394]: 2025-12-05 12:19:24.198722072 +0000 UTC m=+0.051139054 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:19:24 np0005546909 podman[242393]: 2025-12-05 12:19:24.201005998 +0000 UTC m=+0.055886890 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:19:24 np0005546909 podman[242395]: 2025-12-05 12:19:24.234078184 +0000 UTC m=+0.083098448 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:19:24 np0005546909 nova_compute[187208]: 2025-12-05 12:19:24.513 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:25 np0005546909 nova_compute[187208]: 2025-12-05 12:19:25.744 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Dec  5 07:19:27 np0005546909 systemd[1]: machine-qemu\x2d130\x2dinstance\x2d00000069.scope: Deactivated successfully.
Dec  5 07:19:27 np0005546909 systemd[1]: machine-qemu\x2d130\x2dinstance\x2d00000069.scope: Consumed 12.021s CPU time.
Dec  5 07:19:27 np0005546909 systemd-machined[153543]: Machine qemu-130-instance-00000069 terminated.
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.223 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.759 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance shutdown successfully after 13 seconds.#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.764 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.769 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.770 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deleting instance files /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.770 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deletion of /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del complete#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.956 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.956 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating image(s)#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.957 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.957 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.958 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:28 np0005546909 nova_compute[187208]: 2025-12-05 12:19:28.972 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.031 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.032 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "1e39c16656988ee114089078431239bf806417db" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.033 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "1e39c16656988ee114089078431239bf806417db" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.053 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.112 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.113 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.149 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db,backing_fmt=raw /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.150 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "1e39c16656988ee114089078431239bf806417db" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.150 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.209 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.210 187212 DEBUG nova.virt.disk.api [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Checking if we can resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.210 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.268 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.disk.api [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Cannot resize image /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.270 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Ensure instance console log exists: /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.271 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.271 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.272 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.273 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.277 187212 WARNING nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.298 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.299 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.304 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.libvirt.host [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.305 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:10Z,direct_url=<?>,disk_format='qcow2',id=6e277715-617f-4e35-89c7-208beae9fd5c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.306 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.307 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.308 187212 DEBUG nova.virt.hardware [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.309 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.330 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <uuid>4cdf5703-a103-4583-9e40-a33e86b5bf04</uuid>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <name>instance-00000069</name>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:name>tempest-ServerShowV247Test-server-538809939</nova:name>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:19:29</nova:creationTime>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:user uuid="b0d9487c1e0a49ad9ca1c5ebe37d4ed3">tempest-ServerShowV247Test-1738469039-project-member</nova:user>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:        <nova:project uuid="25d7882911914ef5ae762cbd5dc95a3a">tempest-ServerShowV247Test-1738469039</nova:project>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="6e277715-617f-4e35-89c7-208beae9fd5c"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <nova:ports/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="serial">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="uuid">4cdf5703-a103-4583-9e40-a33e86b5bf04</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/console.log" append="off"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:19:29 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:19:29 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:19:29 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:19:29 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.389 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.389 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.390 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Using config drive
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.408 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.448 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'keypairs' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.515 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.829 187212 INFO nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Creating config drive at /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.834 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpley9fnfz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:29 np0005546909 nova_compute[187208]: 2025-12-05 12:19:29.959 187212 DEBUG oslo_concurrency.processutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpley9fnfz" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:30 np0005546909 systemd-machined[153543]: New machine qemu-131-instance-00000069.
Dec  5 07:19:30 np0005546909 systemd[1]: Started Virtual Machine qemu-131-instance-00000069.
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.444 187212 DEBUG nova.virt.libvirt.host [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Removed pending event for 4cdf5703-a103-4583-9e40-a33e86b5bf04 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.445 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937170.4435465, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.445 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Resumed (Lifecycle Event)
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.449 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.449 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.453 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance spawned successfully.
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.454 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.479 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.486 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.491 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.491 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.492 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.492 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.493 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.494 187212 DEBUG nova.virt.libvirt.driver [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.519 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.520 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937170.4449296, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.520 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Started (Lifecycle Event)
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.540 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.541 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.548 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.552 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.558 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.585 187212 DEBUG nova.compute.manager [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.587 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.675 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.675 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.676 187212 DEBUG nova.objects.instance [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.693 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.744 187212 DEBUG oslo_concurrency.lockutils [None req-1ed7df9b-2622-4f4b-96b7-545bcb781bbd b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.745 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.752 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.752 187212 INFO nova.compute.claims [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Claim successful on node compute-0.ctlplane.example.com
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.913 187212 DEBUG nova.compute.provider_tree [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.928 187212 DEBUG nova.scheduler.client.report [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.957 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:30 np0005546909 nova_compute[187208]: 2025-12-05 12:19:30.958 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.011 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.012 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.031 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.049 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.157 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.158 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.159 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating image(s)
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.161 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.161 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.162 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.175 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.245 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.246 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.246 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.260 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.317 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.318 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.390 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.392 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.393 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.461 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.462 187212 DEBUG nova.virt.disk.api [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Checking if we can resize image /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.463 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.491 187212 DEBUG nova.policy [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '223f7822261946cc9228b2207bd1096c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.527 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.528 187212 DEBUG nova.virt.disk.api [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Cannot resize image /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.529 187212 DEBUG nova.objects.instance [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'migration_context' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.543 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Ensure instance console log exists: /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.544 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:31 np0005546909 nova_compute[187208]: 2025-12-05 12:19:31.545 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.109 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.110 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.111 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.112 187212 INFO nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Terminating instance
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquired lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.113 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.418 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.511 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Successfully created port: f7a08175-a5c6-45b7-b194-819c5b881995 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.741 187212 DEBUG nova.network.neutron [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.756 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Releasing lock "refresh_cache-4cdf5703-a103-4583-9e40-a33e86b5bf04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:33 np0005546909 nova_compute[187208]: 2025-12-05 12:19:33.756 187212 DEBUG nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:19:33 np0005546909 systemd[1]: machine-qemu\x2d131\x2dinstance\x2d00000069.scope: Deactivated successfully.
Dec  5 07:19:33 np0005546909 systemd[1]: machine-qemu\x2d131\x2dinstance\x2d00000069.scope: Consumed 3.761s CPU time.
Dec  5 07:19:33 np0005546909 systemd-machined[153543]: Machine qemu-131-instance-00000069 terminated.
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.004 187212 INFO nova.virt.libvirt.driver [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance destroyed successfully.#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.005 187212 DEBUG nova.objects.instance [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 4cdf5703-a103-4583-9e40-a33e86b5bf04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.023 187212 INFO nova.virt.libvirt.driver [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deleting instance files /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.023 187212 INFO nova.virt.libvirt.driver [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deletion of /var/lib/nova/instances/4cdf5703-a103-4583-9e40-a33e86b5bf04_del complete#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.073 187212 INFO nova.compute.manager [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.073 187212 DEBUG oslo.service.loopingcall [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.074 187212 DEBUG nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.074 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.507 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Successfully updated port: f7a08175-a5c6-45b7-b194-819c5b881995 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.517 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.529 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.529 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquired lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.530 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.644 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.662 187212 DEBUG nova.network.neutron [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.697 187212 INFO nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Took 0.62 seconds to deallocate network for instance.#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.753 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.754 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.773 187212 DEBUG nova.compute.manager [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-changed-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.773 187212 DEBUG nova.compute.manager [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Refreshing instance network info cache due to event network-changed-f7a08175-a5c6-45b7-b194-819c5b881995. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.774 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.845 187212 DEBUG nova.compute.provider_tree [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.859 187212 DEBUG nova.scheduler.client.report [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.880 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.916 187212 INFO nova.scheduler.client.report [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Deleted allocations for instance 4cdf5703-a103-4583-9e40-a33e86b5bf04#033[00m
Dec  5 07:19:34 np0005546909 nova_compute[187208]: 2025-12-05 12:19:34.993 187212 DEBUG oslo_concurrency.lockutils [None req-ee70edc2-49d3-42e9-a418-b189113936f9 b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "4cdf5703-a103-4583-9e40-a33e86b5bf04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:35 np0005546909 nova_compute[187208]: 2025-12-05 12:19:35.058 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:19:35 np0005546909 podman[242534]: 2025-12-05 12:19:35.199206319 +0000 UTC m=+0.053827521 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.134 187212 DEBUG nova.network.neutron [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.166 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Releasing lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.166 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance network_info: |[{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.167 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.167 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Refreshing network info cache for port f7a08175-a5c6-45b7-b194-819c5b881995 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.172 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start _get_guest_xml network_info=[{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.181 187212 WARNING nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.186 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.187 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.190 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.190 187212 DEBUG nova.virt.libvirt.host [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.191 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.191 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.192 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.193 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.194 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.194 187212 DEBUG nova.virt.hardware [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.200 187212 DEBUG nova.virt.libvirt.vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:31Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.200 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.201 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.204 187212 DEBUG nova.objects.instance [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.220 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <uuid>3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</uuid>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <name>instance-0000006a</name>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:name>tempest-VolumesActionsTest-instance-793938802</nova:name>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:19:36</nova:creationTime>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:user uuid="223f7822261946cc9228b2207bd1096c">tempest-VolumesActionsTest-1057905007-project-member</nova:user>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:project uuid="3463fde58c6c4bea98c82b2cb087a0dd">tempest-VolumesActionsTest-1057905007</nova:project>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        <nova:port uuid="f7a08175-a5c6-45b7-b194-819c5b881995">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="serial">3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="uuid">3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:b6:75:08"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <target dev="tapf7a08175-a5"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/console.log" append="off"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:19:36 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:19:36 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:19:36 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:19:36 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.221 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Preparing to wait for external event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.222 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.223 187212 DEBUG nova.virt.libvirt.vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:31Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.223 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.224 187212 DEBUG nova.network.os_vif_util [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.224 187212 DEBUG os_vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.225 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.226 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.230 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.231 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7a08175-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.231 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7a08175-a5, col_values=(('external_ids', {'iface-id': 'f7a08175-a5c6-45b7-b194-819c5b881995', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:75:08', 'vm-uuid': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:36 np0005546909 NetworkManager[55691]: <info>  [1764937176.4063] manager: (tapf7a08175-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.407 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.411 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.415 187212 INFO os_vif [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5')
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.476 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.476 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.477 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No VIF found with MAC fa:16:3e:b6:75:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.477 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Using config drive
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.600 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.601 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.602 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.602 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.603 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.605 187212 INFO nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Terminating instance
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.606 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.607 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquired lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.607 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec  5 07:19:36 np0005546909 nova_compute[187208]: 2025-12-05 12:19:36.976 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.196 187212 INFO nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Creating config drive at /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.201 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_3cqb5n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.331 187212 DEBUG oslo_concurrency.processutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn_3cqb5n" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.372 187212 DEBUG nova.network.neutron [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.387 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Releasing lock "refresh_cache-06e1cdc7-fc0d-4de0-baed-0876536b7ee1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.388 187212 DEBUG nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:19:37 np0005546909 kernel: tapf7a08175-a5: entered promiscuous mode
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.4165] manager: (tapf7a08175-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/427)
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.416 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:37Z|01118|binding|INFO|Claiming lport f7a08175-a5c6-45b7-b194-819c5b881995 for this chassis.
Dec  5 07:19:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:37Z|01119|binding|INFO|f7a08175-a5c6-45b7-b194-819c5b881995: Claiming fa:16:3e:b6:75:08 10.100.0.11
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.458 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:75:08 10.100.0.11'], port_security=['fa:16:3e:b6:75:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a08175-a5c6-45b7-b194-819c5b881995) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.459 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.460 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a08175-a5c6-45b7-b194-819c5b881995 in datapath 52916d9d-eb76-4677-8333-d02c9507adbc bound to our chassis#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.461 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52916d9d-eb76-4677-8333-d02c9507adbc#033[00m
Dec  5 07:19:37 np0005546909 systemd-udevd[242580]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.477 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5734d1f4-5460-483d-b2ff-49f2939bc990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.478 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52916d9d-e1 in ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.480 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52916d9d-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.480 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[cdacad6b-90bd-4174-a26f-5574ba4ea406]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.481 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2318c9ab-5912-445a-9a51-4c5c39c4f62d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 systemd[1]: machine-qemu\x2d129\x2dinstance\x2d00000068.scope: Deactivated successfully.
Dec  5 07:19:37 np0005546909 systemd[1]: machine-qemu\x2d129\x2dinstance\x2d00000068.scope: Consumed 12.818s CPU time.
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.4946] device (tapf7a08175-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.493 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[b42d40c7-0752-4863-995c-88788d1bfd97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.4957] device (tapf7a08175-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:19:37 np0005546909 systemd-machined[153543]: Machine qemu-129-instance-00000068 terminated.
Dec  5 07:19:37 np0005546909 systemd-machined[153543]: New machine qemu-132-instance-0000006a.
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.509 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b2d97fee-9df8-4073-871c-978863fac552]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.511 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 systemd[1]: Started Virtual Machine qemu-132-instance-0000006a.
Dec  5 07:19:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:37Z|01120|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 ovn-installed in OVS
Dec  5 07:19:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:37Z|01121|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 up in Southbound
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.516 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.552 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[97a1af6f-fd33-4180-8e8c-87757eeb5f33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.557 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c2783415-aee0-4e6f-8949-945dee034afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.5585] manager: (tap52916d9d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/428)
Dec  5 07:19:37 np0005546909 systemd-udevd[242579]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.594 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[873917d7-39c9-43a3-a02e-d8e398f0174f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.598 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[35771e56-e896-4f63-b6da-34730cc44b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.6274] device (tap52916d9d-e0): carrier: link connected
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.630 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[71ece242-78b8-42a3-8716-cc5a835e9572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.635 187212 INFO nova.virt.libvirt.driver [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance destroyed successfully.#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.636 187212 DEBUG nova.objects.instance [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lazy-loading 'resources' on Instance uuid 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.650 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[7b822629-72bc-4dac-bd41-220aa929ca64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455818, 'reachable_time': 38112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242620, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.654 187212 INFO nova.virt.libvirt.driver [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deleting instance files /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1_del#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.655 187212 INFO nova.virt.libvirt.driver [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deletion of /var/lib/nova/instances/06e1cdc7-fc0d-4de0-baed-0876536b7ee1_del complete#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.666 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf60c6c8-af16-4cc1-a3d8-21f21aebedc8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:c462'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455818, 'tstamp': 455818}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242621, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.684 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8709c161-9786-4d1a-beae-1f99e142ac4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455818, 'reachable_time': 38112, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242622, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 INFO nova.compute.manager [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 DEBUG oslo.service.loopingcall [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.715 187212 DEBUG nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.716 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.716 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3df1e73c-cb62-45ee-a48e-970a6ca360c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.786 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab95947-9868-4d71-a171-97a892832230]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.788 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.789 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.789 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52916d9d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.791 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 kernel: tap52916d9d-e0: entered promiscuous mode
Dec  5 07:19:37 np0005546909 NetworkManager[55691]: <info>  [1764937177.7920] manager: (tap52916d9d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.793 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.794 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52916d9d-e0, col_values=(('external_ids', {'iface-id': 'bfd2a34a-bdd5-4486-82a8-fc55b6e1020a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:37Z|01122|binding|INFO|Releasing lport bfd2a34a-bdd5-4486-82a8-fc55b6e1020a from this chassis (sb_readonly=0)
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.796 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.797 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.806 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[51b7d638-3eca-403c-bbe1-87f0b0d05fd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:37 np0005546909 nova_compute[187208]: 2025-12-05 12:19:37.807 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.808 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-52916d9d-eb76-4677-8333-d02c9507adbc
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 52916d9d-eb76-4677-8333-d02c9507adbc
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:19:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:37.810 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'env', 'PROCESS_TAG=haproxy-52916d9d-eb76-4677-8333-d02c9507adbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52916d9d-eb76-4677-8333-d02c9507adbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.132 187212 DEBUG nova.compute.manager [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.133 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG oslo_concurrency.lockutils [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.134 187212 DEBUG nova.compute.manager [req-b35ce780-9cb8-4bde-88ef-aef77ed69a7b req-5f0e75de-c889-4052-8c76-096d21acb98e 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Processing event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.145 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1450875, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.146 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Started (Lifecycle Event)#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.148 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.153 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.157 187212 INFO nova.virt.libvirt.driver [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance spawned successfully.#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.157 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.177 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.183 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.188 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.188 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.189 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.189 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.190 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.190 187212 DEBUG nova.virt.libvirt.driver [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.218 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.220 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1478107, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.220 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:19:38 np0005546909 podman[242661]: 2025-12-05 12:19:38.231673213 +0000 UTC m=+0.053587434 container create a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.244 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.251 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937178.1524527, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.252 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.254 187212 INFO nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 7.10 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.255 187212 DEBUG nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.267 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:38 np0005546909 systemd[1]: Started libpod-conmon-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope.
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.272 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:38 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:19:38 np0005546909 podman[242661]: 2025-12-05 12:19:38.202838588 +0000 UTC m=+0.024752829 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:19:38 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe9e174fa0d1960b337c7b5c186b03bdd5ebfcd106c3c3e9d8495dbd20d710fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.303 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:38 np0005546909 podman[242661]: 2025-12-05 12:19:38.319411793 +0000 UTC m=+0.141326024 container init a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  5 07:19:38 np0005546909 podman[242661]: 2025-12-05 12:19:38.325147897 +0000 UTC m=+0.147062108 container start a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.330 187212 INFO nova.compute.manager [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 7.68 seconds to build instance.#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.347 187212 DEBUG oslo_concurrency.lockutils [None req-2b55395a-036b-4071-951a-3f8b340b08e8 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:38 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : New worker (242684) forked
Dec  5 07:19:38 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : Loading success.
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.482 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.539 187212 DEBUG nova.network.neutron [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.553 187212 INFO nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Took 0.84 seconds to deallocate network for instance.#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.607 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.610 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.697 187212 DEBUG nova.compute.provider_tree [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.712 187212 DEBUG nova.scheduler.client.report [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.737 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.779 187212 INFO nova.scheduler.client.report [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Deleted allocations for instance 06e1cdc7-fc0d-4de0-baed-0876536b7ee1#033[00m
Dec  5 07:19:38 np0005546909 nova_compute[187208]: 2025-12-05 12:19:38.842 187212 DEBUG oslo_concurrency.lockutils [None req-98dc242c-ca21-4f0b-a7da-b307b12206db b0d9487c1e0a49ad9ca1c5ebe37d4ed3 25d7882911914ef5ae762cbd5dc95a3a - - default default] Lock "06e1cdc7-fc0d-4de0-baed-0876536b7ee1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:39 np0005546909 nova_compute[187208]: 2025-12-05 12:19:39.220 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updated VIF entry in instance network info cache for port f7a08175-a5c6-45b7-b194-819c5b881995. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:19:39 np0005546909 nova_compute[187208]: 2025-12-05 12:19:39.223 187212 DEBUG nova.network.neutron [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [{"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:39 np0005546909 nova_compute[187208]: 2025-12-05 12:19:39.242 187212 DEBUG oslo_concurrency.lockutils [req-19df6fb8-e154-4e6a-a1e5-28d819a0727d req-13904362-5689-4543-8b46-75d1033724d6 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:39 np0005546909 nova_compute[187208]: 2025-12-05 12:19:39.518 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.320 187212 DEBUG nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.321 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG oslo_concurrency.lockutils [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.322 187212 DEBUG nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:40 np0005546909 nova_compute[187208]: 2025-12-05 12:19:40.323 187212 WARNING nova.compute.manager [req-0444da2b-ed3c-47b6-a790-761c8c531d85 req-c3037394-9658-403e-acb3-c4898543253b 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received unexpected event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with vm_state active and task_state None.#033[00m
Dec  5 07:19:41 np0005546909 podman[242693]: 2025-12-05 12:19:41.217889613 +0000 UTC m=+0.067556473 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.517 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.520 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.521 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.522 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.523 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.525 187212 INFO nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Terminating instance#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.527 187212 DEBUG nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:19:41 np0005546909 kernel: tapf7a08175-a5 (unregistering): left promiscuous mode
Dec  5 07:19:41 np0005546909 NetworkManager[55691]: <info>  [1764937181.5453] device (tapf7a08175-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:41Z|01123|binding|INFO|Releasing lport f7a08175-a5c6-45b7-b194-819c5b881995 from this chassis (sb_readonly=0)
Dec  5 07:19:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:41Z|01124|binding|INFO|Setting lport f7a08175-a5c6-45b7-b194-819c5b881995 down in Southbound
Dec  5 07:19:41 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:41Z|01125|binding|INFO|Removing iface tapf7a08175-a5 ovn-installed in OVS
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.555 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.563 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:75:08 10.100.0.11'], port_security=['fa:16:3e:b6:75:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=f7a08175-a5c6-45b7-b194-819c5b881995) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.564 104471 INFO neutron.agent.ovn.metadata.agent [-] Port f7a08175-a5c6-45b7-b194-819c5b881995 in datapath 52916d9d-eb76-4677-8333-d02c9507adbc unbound from our chassis#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.565 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52916d9d-eb76-4677-8333-d02c9507adbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.567 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.567 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[acd7a734-343e-413c-a923-ea16f1f23b1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.567 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace which is not needed anymore#033[00m
Dec  5 07:19:41 np0005546909 systemd[1]: machine-qemu\x2d132\x2dinstance\x2d0000006a.scope: Deactivated successfully.
Dec  5 07:19:41 np0005546909 systemd[1]: machine-qemu\x2d132\x2dinstance\x2d0000006a.scope: Consumed 4.034s CPU time.
Dec  5 07:19:41 np0005546909 systemd-machined[153543]: Machine qemu-132-instance-0000006a terminated.
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : haproxy version is 2.8.14-c23fe91
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [NOTICE]   (242682) : path to executable is /usr/sbin/haproxy
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : Exiting Master process...
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : Exiting Master process...
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [ALERT]    (242682) : Current worker (242684) exited with code 143 (Terminated)
Dec  5 07:19:41 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242674]: [WARNING]  (242682) : All workers exited. Exiting... (0)
Dec  5 07:19:41 np0005546909 systemd[1]: libpod-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope: Deactivated successfully.
Dec  5 07:19:41 np0005546909 conmon[242674]: conmon a2fc1f32fe789290236e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope/container/memory.events
Dec  5 07:19:41 np0005546909 podman[242739]: 2025-12-05 12:19:41.699885253 +0000 UTC m=+0.045216745 container died a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  5 07:19:41 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0-userdata-shm.mount: Deactivated successfully.
Dec  5 07:19:41 np0005546909 systemd[1]: var-lib-containers-storage-overlay-fe9e174fa0d1960b337c7b5c186b03bdd5ebfcd106c3c3e9d8495dbd20d710fc-merged.mount: Deactivated successfully.
Dec  5 07:19:41 np0005546909 podman[242739]: 2025-12-05 12:19:41.744513829 +0000 UTC m=+0.089845321 container cleanup a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:19:41 np0005546909 systemd[1]: libpod-conmon-a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0.scope: Deactivated successfully.
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.826 187212 INFO nova.virt.libvirt.driver [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Instance destroyed successfully.#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.828 187212 DEBUG nova.objects.instance [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'resources' on Instance uuid 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.841 187212 DEBUG nova.virt.libvirt.vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:19:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-793938802',display_name='tempest-VolumesActionsTest-instance-793938802',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-793938802',id=106,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:19:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-6clntd3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:19:38Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.842 187212 DEBUG nova.network.os_vif_util [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "f7a08175-a5c6-45b7-b194-819c5b881995", "address": "fa:16:3e:b6:75:08", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7a08175-a5", "ovs_interfaceid": "f7a08175-a5c6-45b7-b194-819c5b881995", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.843 187212 DEBUG nova.network.os_vif_util [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.843 187212 DEBUG os_vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.845 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.845 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7a08175-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.846 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 podman[242774]: 2025-12-05 12:19:41.849872734 +0000 UTC m=+0.053231784 container remove a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.851 187212 INFO os_vif [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:75:08,bridge_name='br-int',has_traffic_filtering=True,id=f7a08175-a5c6-45b7-b194-819c5b881995,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7a08175-a5')#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.852 187212 INFO nova.virt.libvirt.driver [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deleting instance files /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e_del#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.853 187212 INFO nova.virt.libvirt.driver [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deletion of /var/lib/nova/instances/3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e_del complete#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.854 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3f56af92-06df-4899-b635-9d6263b8b83f]: (4, ('Fri Dec  5 12:19:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0)\na2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0\nFri Dec  5 12:19:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0)\na2fc1f32fe789290236e4d6af3176f828353d7591daf03efbcabb0fe76de81f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.855 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[482b2f5f-7d5a-4978-9c66-20d7c08c921f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.856 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.858 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 kernel: tap52916d9d-e0: left promiscuous mode
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.868 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.869 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.870 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4abc68-ac86-40ce-ba41-7c0319db6268]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.887 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[0d5d3070-4199-4ad3-93ed-f48d6091a2eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.889 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec273c0-22f5-4dd6-b8bc-85c61a516435]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.907 187212 INFO nova.compute.manager [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.908 187212 DEBUG oslo.service.loopingcall [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.909 187212 DEBUG nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:19:41 np0005546909 nova_compute[187208]: 2025-12-05 12:19:41.909 187212 DEBUG nova.network.neutron [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.908 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fccee749-f5ef-489f-8363-f553aa0ba029]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455810, 'reachable_time': 42002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242798, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.912 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:19:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:41.913 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c57f4809-13ae-453c-994f-f9d4a6876437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:41 np0005546909 systemd[1]: run-netns-ovnmeta\x2d52916d9d\x2deb76\x2d4677\x2d8333\x2dd02c9507adbc.mount: Deactivated successfully.
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.572 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.572 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.573 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.573 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.574 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.574 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-unplugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.575 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.575 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG oslo_concurrency.lockutils [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.576 187212 DEBUG nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] No waiting events found dispatching network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:42 np0005546909 nova_compute[187208]: 2025-12-05 12:19:42.577 187212 WARNING nova.compute.manager [req-c8ad53dd-0877-474a-afbc-13b0df77c736 req-06b6a8b2-ab08-472d-97e5-51b21045ff96 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received unexpected event network-vif-plugged-f7a08175-a5c6-45b7-b194-819c5b881995 for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.384 187212 DEBUG nova.network.neutron [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.405 187212 INFO nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Took 1.50 seconds to deallocate network for instance.#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.468 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.469 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.551 187212 DEBUG nova.compute.provider_tree [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.568 187212 DEBUG nova.scheduler.client.report [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.598 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.610 187212 DEBUG nova.compute.manager [req-459ac535-60cf-4884-b316-39f17361ac48 req-554e2935-329c-4a68-a133-4a6c5cd558cf 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Received event network-vif-deleted-f7a08175-a5c6-45b7-b194-819c5b881995 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.632 187212 INFO nova.scheduler.client.report [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Deleted allocations for instance 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e#033[00m
Dec  5 07:19:43 np0005546909 nova_compute[187208]: 2025-12-05 12:19:43.688 187212 DEBUG oslo_concurrency.lockutils [None req-4439f4eb-b3fb-4dcb-9762-38036a5517a3 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:44 np0005546909 nova_compute[187208]: 2025-12-05 12:19:44.520 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:45 np0005546909 podman[242800]: 2025-12-05 12:19:45.22149872 +0000 UTC m=+0.077462917 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:19:45 np0005546909 podman[242799]: 2025-12-05 12:19:45.240195745 +0000 UTC m=+0.096291266 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350)
Dec  5 07:19:46 np0005546909 nova_compute[187208]: 2025-12-05 12:19:46.848 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:47 np0005546909 nova_compute[187208]: 2025-12-05 12:19:47.941 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:47 np0005546909 nova_compute[187208]: 2025-12-05 12:19:47.941 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:47 np0005546909 nova_compute[187208]: 2025-12-05 12:19:47.965 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.043 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.044 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.050 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.051 187212 INFO nova.compute.claims [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Claim successful on node compute-0.ctlplane.example.com#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.477 187212 DEBUG nova.compute.provider_tree [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.494 187212 DEBUG nova.scheduler.client.report [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.524 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.524 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.585 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.585 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.671 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.773 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.868 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.870 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.870 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating image(s)#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.871 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.871 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.872 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.888 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.954 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.955 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "c47971824929b7466134c539db51093d53350524" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.956 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:48 np0005546909 nova_compute[187208]: 2025-12-05 12:19:48.967 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.003 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937174.0022178, 4cdf5703-a103-4583-9e40-a33e86b5bf04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.004 187212 INFO nova.compute.manager [-] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.026 187212 DEBUG nova.compute.manager [None req-ddd973ba-e75d-48b8-87d4-ea05960e6a31 - - - - - -] [instance: 4cdf5703-a103-4583-9e40-a33e86b5bf04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.028 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.028 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.063 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524,backing_fmt=raw /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.065 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "c47971824929b7466134c539db51093d53350524" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.065 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.132 187212 DEBUG nova.policy [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '223f7822261946cc9228b2207bd1096c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.136 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.136 187212 DEBUG nova.virt.disk.api [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Checking if we can resize image /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.137 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.227 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.229 187212 DEBUG nova.virt.disk.api [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Cannot resize image /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.230 187212 DEBUG nova.objects.instance [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'migration_context' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.251 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.252 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Ensure instance console log exists: /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.252 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.253 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:49 np0005546909 nova_compute[187208]: 2025-12-05 12:19:49.547 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:50 np0005546909 nova_compute[187208]: 2025-12-05 12:19:50.077 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:50 np0005546909 nova_compute[187208]: 2025-12-05 12:19:50.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:19:50 np0005546909 nova_compute[187208]: 2025-12-05 12:19:50.180 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Successfully created port: ddf5ec0d-377b-480c-8991-738446cfb2db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.081 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.793 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Successfully updated port: ddf5ec0d-377b-480c-8991-738446cfb2db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.811 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.812 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquired lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.812 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  5 07:19:51 np0005546909 nova_compute[187208]: 2025-12-05 12:19:51.851 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.028 187212 DEBUG nova.compute.manager [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-changed-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.029 187212 DEBUG nova.compute.manager [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Refreshing instance network info cache due to event network-changed-ddf5ec0d-377b-480c-8991-738446cfb2db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.030 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.160 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.634 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937177.6323996, 06e1cdc7-fc0d-4de0-baed-0876536b7ee1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.635 187212 INFO nova.compute.manager [-] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:19:52 np0005546909 nova_compute[187208]: 2025-12-05 12:19:52.817 187212 DEBUG nova.compute.manager [None req-957d7a77-fdff-4898-86e0-daef7da8c69c - - - - - -] [instance: 06e1cdc7-fc0d-4de0-baed-0876536b7ee1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.197 187212 DEBUG nova.network.neutron [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.218 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Releasing lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.218 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance network_info: |[{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.219 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquired lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.219 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Refreshing network info cache for port ddf5ec0d-377b-480c-8991-738446cfb2db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.222 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start _get_guest_xml network_info=[{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'encryption_format': None, 'size': 0, 'encrypted': False, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'device_type': 'disk', 'disk_bus': 'virtio', 'image_id': 'a6987852-063f-405d-a848-6b382694811e'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.226 187212 WARNING nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.231 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.231 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.237 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.238 187212 DEBUG nova.virt.libvirt.host [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.238 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.239 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T11:58:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='dd6ed813-dd34-44c2-aed2-e8ae4ec0bb7f',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T11:58:04Z,direct_url=<?>,disk_format='qcow2',id=a6987852-063f-405d-a848-6b382694811e,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3df4e4eed3454c178c5281d12024579e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T11:58:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.239 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.240 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.241 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.242 187212 DEBUG nova.virt.hardware [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.245 187212 DEBUG nova.virt.libvirt.vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-10579050
07-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:48Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.246 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.247 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.248 187212 DEBUG nova.objects.instance [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'pci_devices' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.261 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] End _get_guest_xml xml=<domain type="kvm">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <uuid>7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</uuid>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <name>instance-0000006b</name>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <memory>131072</memory>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <vcpu>1</vcpu>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <metadata>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:name>tempest-VolumesActionsTest-instance-375351270</nova:name>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:creationTime>2025-12-05 12:19:53</nova:creationTime>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:flavor name="m1.nano">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:memory>128</nova:memory>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:disk>1</nova:disk>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:swap>0</nova:swap>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:ephemeral>0</nova:ephemeral>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:vcpus>1</nova:vcpus>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      </nova:flavor>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:owner>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:user uuid="223f7822261946cc9228b2207bd1096c">tempest-VolumesActionsTest-1057905007-project-member</nova:user>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:project uuid="3463fde58c6c4bea98c82b2cb087a0dd">tempest-VolumesActionsTest-1057905007</nova:project>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      </nova:owner>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:root type="image" uuid="a6987852-063f-405d-a848-6b382694811e"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <nova:ports>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        <nova:port uuid="ddf5ec0d-377b-480c-8991-738446cfb2db">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:        </nova:port>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      </nova:ports>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </nova:instance>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </metadata>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <sysinfo type="smbios">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <system>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="manufacturer">RDO</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="product">OpenStack Compute</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="serial">7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="uuid">7eb08f99-b40c-4ba3-9b30-6cfb447ba68d</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <entry name="family">Virtual Machine</entry>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </system>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </sysinfo>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <os>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <boot dev="hd"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <smbios mode="sysinfo"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </os>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <features>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <acpi/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <apic/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <vmcoreinfo/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </features>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <clock offset="utc">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <timer name="pit" tickpolicy="delay"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <timer name="hpet" present="no"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </clock>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <cpu mode="host-model" match="exact">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <topology sockets="1" cores="1" threads="1"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </cpu>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  <devices>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <disk type="file" device="disk">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <target dev="vda" bus="virtio"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <disk type="file" device="cdrom">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <driver name="qemu" type="raw" cache="none"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <source file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <target dev="sda" bus="sata"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </disk>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <interface type="ethernet">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <mac address="fa:16:3e:51:2f:35"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <driver name="vhost" rx_queue_size="512"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <mtu size="1442"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <target dev="tapddf5ec0d-37"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </interface>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <serial type="pty">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <log file="/var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/console.log" append="off"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </serial>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <video>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <model type="virtio"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </video>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <input type="tablet" bus="usb"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <rng model="virtio">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <backend model="random">/dev/urandom</backend>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </rng>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="pci" model="pcie-root-port"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <controller type="usb" index="0"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    <memballoon model="virtio">
Dec  5 07:19:53 np0005546909 nova_compute[187208]:      <stats period="10"/>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:    </memballoon>
Dec  5 07:19:53 np0005546909 nova_compute[187208]:  </devices>
Dec  5 07:19:53 np0005546909 nova_compute[187208]: </domain>
Dec  5 07:19:53 np0005546909 nova_compute[187208]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.262 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Preparing to wait for external event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.263 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.virt.libvirt.vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTes
t-1057905007-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T12:19:48Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.264 187212 DEBUG nova.network.os_vif_util [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.265 187212 DEBUG os_vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.265 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.266 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.266 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.268 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.268 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddf5ec0d-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.269 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddf5ec0d-37, col_values=(('external_ids', {'iface-id': 'ddf5ec0d-377b-480c-8991-738446cfb2db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:2f:35', 'vm-uuid': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:53 np0005546909 NetworkManager[55691]: <info>  [1764937193.2716] manager: (tapddf5ec0d-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.276 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.277 187212 INFO os_vif [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37')#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.329 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] No VIF found with MAC fa:16:3e:51:2f:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.330 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Using config drive#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.853 187212 INFO nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Creating config drive at /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.858 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mc55qvf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:53 np0005546909 nova_compute[187208]: 2025-12-05 12:19:53.987 187212 DEBUG oslo_concurrency.processutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mc55qvf" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:54 np0005546909 kernel: tapddf5ec0d-37: entered promiscuous mode
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.049 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.0508] manager: (tapddf5ec0d-37): new Tun device (/org/freedesktop/NetworkManager/Devices/431)
Dec  5 07:19:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:54Z|01126|binding|INFO|Claiming lport ddf5ec0d-377b-480c-8991-738446cfb2db for this chassis.
Dec  5 07:19:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:54Z|01127|binding|INFO|ddf5ec0d-377b-480c-8991-738446cfb2db: Claiming fa:16:3e:51:2f:35 10.100.0.6
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.052 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:54Z|01128|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db ovn-installed in OVS
Dec  5 07:19:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:54Z|01129|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db up in Southbound
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.063 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:2f:35 10.100.0.6'], port_security=['fa:16:3e:51:2f:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ddf5ec0d-377b-480c-8991-738446cfb2db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.064 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.065 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ddf5ec0d-377b-480c-8991-738446cfb2db in datapath 52916d9d-eb76-4677-8333-d02c9507adbc bound to our chassis#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.067 104471 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52916d9d-eb76-4677-8333-d02c9507adbc#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.077 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d159017d-58f6-43d2-b8ff-1aa7a51b11e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.078 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52916d9d-e1 in ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52916d9d-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[39a579f4-b627-46e9-a29d-3c865bf27ef2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.080 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[1c638d23-d1fa-4810-8fda-0e3e4e016d89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 systemd-udevd[242875]: Network interface NamePolicy= disabled on kernel command line.
Dec  5 07:19:54 np0005546909 systemd-machined[153543]: New machine qemu-133-instance-0000006b.
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.093 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a5fa9f38-2434-42d3-908f-ca358903018d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.1003] device (tapddf5ec0d-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.1018] device (tapddf5ec0d-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  5 07:19:54 np0005546909 systemd[1]: Started Virtual Machine qemu-133-instance-0000006b.
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.113 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[60ef8f18-3f2f-457d-9d84-1f1b9146ccff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.145 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[11fdc477-1af9-4817-840d-746553ce0619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.150 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[729c44f6-32f0-4978-932c-75eed4daecd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.1517] manager: (tap52916d9d-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/432)
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.178 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[fffb8cf7-fc42-4930-9e63-73ded0ef148e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.180 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4494d9-f766-4e13-8e6a-c02434c854a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.2042] device (tap52916d9d-e0): carrier: link connected
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.213 214193 DEBUG oslo.privsep.daemon [-] privsep: reply[804c074c-191c-4011-bb7a-934462bccbd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.231 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c306d941-2055-46ab-abf0-2e5d54e83a59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457476, 'reachable_time': 32043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242908, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.245 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[84bb564a-4e2a-4bb9-8197-89529a5539a9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:c462'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457476, 'tstamp': 457476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242909, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.259 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6236ce24-09cd-4ec5-893d-73b870147080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52916d9d-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:c4:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457476, 'reachable_time': 32043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242910, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.291 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef487fb-f91f-45e9-b8e9-9d14773094ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.339 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[fc69905e-af17-4358-b3ad-d549d6d7cec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.341 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52916d9d-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:54 np0005546909 kernel: tap52916d9d-e0: entered promiscuous mode
Dec  5 07:19:54 np0005546909 NetworkManager[55691]: <info>  [1764937194.3452] manager: (tap52916d9d-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/433)
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.345 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52916d9d-e0, col_values=(('external_ids', {'iface-id': 'bfd2a34a-bdd5-4486-82a8-fc55b6e1020a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:54 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:54Z|01130|binding|INFO|Releasing lport bfd2a34a-bdd5-4486-82a8-fc55b6e1020a from this chassis (sb_readonly=0)
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.346 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.362 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.363 104471 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.363 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[b670e069-f2a8-4c8e-a29a-7104058bd6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.364 104471 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: global
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    log         /dev/log local0 debug
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    log-tag     haproxy-metadata-proxy-52916d9d-eb76-4677-8333-d02c9507adbc
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    user        root
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    group       root
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    maxconn     1024
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    pidfile     /var/lib/neutron/external/pids/52916d9d-eb76-4677-8333-d02c9507adbc.pid.haproxy
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    daemon
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: defaults
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    log global
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    mode http
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    option httplog
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    option dontlognull
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    option http-server-close
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    option forwardfor
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    retries                 3
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-request    30s
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    timeout connect         30s
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    timeout client          32s
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    timeout server          32s
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    timeout http-keep-alive 30s
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: listen listener
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    bind 169.254.169.254:80
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    server metadata /var/lib/neutron/metadata_proxy
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]:    http-request add-header X-OVN-Network-ID 52916d9d-eb76-4677-8333-d02c9507adbc
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  5 07:19:54 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:54.365 104471 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'env', 'PROCESS_TAG=haproxy-52916d9d-eb76-4677-8333-d02c9507adbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52916d9d-eb76-4677-8333-d02c9507adbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.526 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937194.5253913, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.527 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Started (Lifecycle Event)#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.680 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.686 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937194.5256512, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.687 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Paused (Lifecycle Event)#033[00m
Dec  5 07:19:54 np0005546909 podman[242949]: 2025-12-05 12:19:54.71129209 +0000 UTC m=+0.054488620 container create a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.722 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.726 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:54 np0005546909 systemd[1]: Started libpod-conmon-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope.
Dec  5 07:19:54 np0005546909 nova_compute[187208]: 2025-12-05 12:19:54.754 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:54 np0005546909 podman[242949]: 2025-12-05 12:19:54.684510113 +0000 UTC m=+0.027706673 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  5 07:19:54 np0005546909 systemd[1]: Started libcrun container.
Dec  5 07:19:54 np0005546909 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28657b9bcc58b5db932e3079c4da3935dab0ca3c6598d46814aaf9161b94c14/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  5 07:19:54 np0005546909 podman[242949]: 2025-12-05 12:19:54.796718993 +0000 UTC m=+0.139915553 container init a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:19:54 np0005546909 podman[242949]: 2025-12-05 12:19:54.805114414 +0000 UTC m=+0.148310954 container start a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  5 07:19:54 np0005546909 podman[242965]: 2025-12-05 12:19:54.814381309 +0000 UTC m=+0.062066607 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:19:54 np0005546909 podman[242962]: 2025-12-05 12:19:54.824958541 +0000 UTC m=+0.074628676 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec  5 07:19:54 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : New worker (243031) forked
Dec  5 07:19:54 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : Loading success.
Dec  5 07:19:54 np0005546909 podman[242966]: 2025-12-05 12:19:54.858740117 +0000 UTC m=+0.097863160 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.310 187212 DEBUG nova.compute.manager [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.310 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG oslo_concurrency.lockutils [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.311 187212 DEBUG nova.compute.manager [req-ff3332c0-bbd9-4f6d-af3c-ceecb7407b0c req-66354d69-8d56-4452-a94b-e7ae9f75f984 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Processing event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.312 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.317 187212 DEBUG nova.virt.driver [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] Emitting event <LifecycleEvent: 1764937195.3173542, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.318 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Resumed (Lifecycle Event)#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.320 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.324 187212 INFO nova.virt.libvirt.driver [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance spawned successfully.#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.324 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.349 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.356 187212 DEBUG nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.360 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.360 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.361 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.361 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.362 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.363 187212 DEBUG nova.virt.libvirt.driver [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.383 187212 INFO nova.compute.manager [None req-a39d61de-86fa-4e02-8f94-7486e8f03861 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.412 187212 INFO nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 6.54 seconds to spawn the instance on the hypervisor.#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.413 187212 DEBUG nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.471 187212 INFO nova.compute.manager [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 7.45 seconds to build instance.#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.489 187212 DEBUG oslo_concurrency.lockutils [None req-ace6b507-fbb2-42fd-9777-1388f2109e66 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.903 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updated VIF entry in instance network info cache for port ddf5ec0d-377b-480c-8991-738446cfb2db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.903 187212 DEBUG nova.network.neutron [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [{"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:19:55 np0005546909 nova_compute[187208]: 2025-12-05 12:19:55.926 187212 DEBUG oslo_concurrency.lockutils [req-342a61ca-6bac-4a9b-bd02-470bb50591b5 req-56878927-bc34-40a2-b00e-c3200aaa24b8 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Releasing lock "refresh_cache-7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  5 07:19:56 np0005546909 nova_compute[187208]: 2025-12-05 12:19:56.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:56 np0005546909 nova_compute[187208]: 2025-12-05 12:19:56.086 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:56.087 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:19:56 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:56.089 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:19:56 np0005546909 nova_compute[187208]: 2025-12-05 12:19:56.824 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937181.8230302, 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:19:56 np0005546909 nova_compute[187208]: 2025-12-05 12:19:56.825 187212 INFO nova.compute.manager [-] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:19:56 np0005546909 nova_compute[187208]: 2025-12-05 12:19:56.846 187212 DEBUG nova.compute.manager [None req-b29b46de-d7c7-4a7c-8486-0227331da15d - - - - - -] [instance: 3f4dc92b-49fb-4d85-9eb2-1a2f97080d9e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.463 187212 DEBUG nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.464 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.465 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.465 187212 DEBUG oslo_concurrency.lockutils [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.466 187212 DEBUG nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:57 np0005546909 nova_compute[187208]: 2025-12-05 12:19:57.466 187212 WARNING nova.compute.manager [req-ef4b2814-56b0-4193-82f1-4984fec55e7a req-3ff67fd4-5a10-4936-a858-b6bd1c05e52c 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received unexpected event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with vm_state active and task_state None.#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.087 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.088 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.162 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.222 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.223 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.282 187212 DEBUG oslo_concurrency.processutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.286 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.438 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.439 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5409MB free_disk=73.03979873657227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.439 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.440 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.515 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Instance 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.516 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.516 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.570 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.584 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.615 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.615 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.799 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.801 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.802 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.803 187212 INFO nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Terminating instance#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.804 187212 DEBUG nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  5 07:19:58 np0005546909 kernel: tapddf5ec0d-37 (unregistering): left promiscuous mode
Dec  5 07:19:58 np0005546909 NetworkManager[55691]: <info>  [1764937198.8296] device (tapddf5ec0d-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  5 07:19:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:58Z|01131|binding|INFO|Releasing lport ddf5ec0d-377b-480c-8991-738446cfb2db from this chassis (sb_readonly=0)
Dec  5 07:19:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:58Z|01132|binding|INFO|Setting lport ddf5ec0d-377b-480c-8991-738446cfb2db down in Southbound
Dec  5 07:19:58 np0005546909 ovn_controller[95610]: 2025-12-05T12:19:58Z|01133|binding|INFO|Removing iface tapddf5ec0d-37 ovn-installed in OVS
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.840 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.848 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:2f:35 10.100.0.6'], port_security=['fa:16:3e:51:2f:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7eb08f99-b40c-4ba3-9b30-6cfb447ba68d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52916d9d-eb76-4677-8333-d02c9507adbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3463fde58c6c4bea98c82b2cb087a0dd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '21c17f9f-419d-4f4a-937b-418d08c504db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ca61dc-1564-41a5-9908-f7ecfadeff73, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>], logical_port=ddf5ec0d-377b-480c-8991-738446cfb2db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f189ffba6a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:19:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.850 104471 INFO neutron.agent.ovn.metadata.agent [-] Port ddf5ec0d-377b-480c-8991-738446cfb2db in datapath 52916d9d-eb76-4677-8333-d02c9507adbc unbound from our chassis#033[00m
Dec  5 07:19:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.852 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52916d9d-eb76-4677-8333-d02c9507adbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:19:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.854 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[bf133c81-ba71-44ec-82d6-1fcde3250615]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:58.855 104471 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc namespace which is not needed anymore#033[00m
Dec  5 07:19:58 np0005546909 nova_compute[187208]: 2025-12-05 12:19:58.856 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:58 np0005546909 systemd[1]: machine-qemu\x2d133\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Dec  5 07:19:58 np0005546909 systemd[1]: machine-qemu\x2d133\x2dinstance\x2d0000006b.scope: Consumed 3.954s CPU time.
Dec  5 07:19:58 np0005546909 systemd-machined[153543]: Machine qemu-133-instance-0000006b terminated.
Dec  5 07:19:58 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : haproxy version is 2.8.14-c23fe91
Dec  5 07:19:58 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [NOTICE]   (243013) : path to executable is /usr/sbin/haproxy
Dec  5 07:19:58 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [WARNING]  (243013) : Exiting Master process...
Dec  5 07:19:58 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [ALERT]    (243013) : Current worker (243031) exited with code 143 (Terminated)
Dec  5 07:19:58 np0005546909 neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc[242977]: [WARNING]  (243013) : All workers exited. Exiting... (0)
Dec  5 07:19:58 np0005546909 systemd[1]: libpod-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope: Deactivated successfully.
Dec  5 07:19:59 np0005546909 podman[243076]: 2025-12-05 12:19:59.006153018 +0000 UTC m=+0.053330267 container died a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.034 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d-userdata-shm.mount: Deactivated successfully.
Dec  5 07:19:59 np0005546909 systemd[1]: var-lib-containers-storage-overlay-d28657b9bcc58b5db932e3079c4da3935dab0ca3c6598d46814aaf9161b94c14-merged.mount: Deactivated successfully.
Dec  5 07:19:59 np0005546909 podman[243076]: 2025-12-05 12:19:59.054742518 +0000 UTC m=+0.101919777 container cleanup a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2)
Dec  5 07:19:59 np0005546909 systemd[1]: libpod-conmon-a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d.scope: Deactivated successfully.
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.090 187212 INFO nova.virt.libvirt.driver [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Instance destroyed successfully.#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.092 187212 DEBUG nova.objects.instance [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lazy-loading 'resources' on Instance uuid 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.106 187212 DEBUG nova.virt.libvirt.vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-05T12:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-375351270',display_name='tempest-VolumesActionsTest-instance-375351270',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-volumesactionstest-instance-375351270',id=107,image_ref='a6987852-063f-405d-a848-6b382694811e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T12:19:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3463fde58c6c4bea98c82b2cb087a0dd',ramdisk_id='',reservation_id='r-g6z4cse6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='a6987852-063f-405d-a848-6b382694811e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-VolumesActionsTest-1057905007',owner_user_name='tempest-VolumesActionsTest-1057905007-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T12:19:55Z,user_data=None,user_id='223f7822261946cc9228b2207bd1096c',uuid=7eb08f99-b40c-4ba3-9b30-6cfb447ba68d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.107 187212 DEBUG nova.network.os_vif_util [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converting VIF {"id": "ddf5ec0d-377b-480c-8991-738446cfb2db", "address": "fa:16:3e:51:2f:35", "network": {"id": "52916d9d-eb76-4677-8333-d02c9507adbc", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1815806297-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3463fde58c6c4bea98c82b2cb087a0dd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddf5ec0d-37", "ovs_interfaceid": "ddf5ec0d-377b-480c-8991-738446cfb2db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.108 187212 DEBUG nova.network.os_vif_util [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.109 187212 DEBUG os_vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.112 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 34 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.113 187212 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddf5ec0d-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.118 187212 INFO os_vif [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:2f:35,bridge_name='br-int',has_traffic_filtering=True,id=ddf5ec0d-377b-480c-8991-738446cfb2db,network=Network(52916d9d-eb76-4677-8333-d02c9507adbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddf5ec0d-37')#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.118 187212 INFO nova.virt.libvirt.driver [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deleting instance files /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d_del#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.119 187212 INFO nova.virt.libvirt.driver [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deletion of /var/lib/nova/instances/7eb08f99-b40c-4ba3-9b30-6cfb447ba68d_del complete#033[00m
Dec  5 07:19:59 np0005546909 podman[243117]: 2025-12-05 12:19:59.126575353 +0000 UTC m=+0.043523666 container remove a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.131 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[9954b2a6-7800-43a5-b865-841406270ca0]: (4, ('Fri Dec  5 12:19:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d)\na9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d\nFri Dec  5 12:19:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc (a9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d)\na9fb03726f7e14beee96efb023cf085f7f5ed827826ff144397d9887fcf2161d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.133 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[20b9d631-0816-4570-826f-8f489feffdd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.134 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52916d9d-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.136 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 kernel: tap52916d9d-e0: left promiscuous mode
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.149 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.153 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[793a252d-7e57-47f7-a47e-7674d0f7530c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.170 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[a76e09e0-e9b5-4ba0-b4b9-122d3685b3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.172 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce83d5d-20a5-4c32-a671-7d18beaedbb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 INFO nova.compute.manager [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 DEBUG oslo.service.loopingcall [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.182 187212 DEBUG nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.183 187212 DEBUG nova.network.neutron [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.189 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[d294dde2-3636-4c66-acf0-e64ce6282ba8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457470, 'reachable_time': 34426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243134, 'error': None, 'target': 'ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.192 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52916d9d-eb76-4677-8333-d02c9507adbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  5 07:19:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:19:59.192 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea968d1-3177-4b6f-817a-b6ee98221a3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:19:59 np0005546909 systemd[1]: run-netns-ovnmeta\x2d52916d9d\x2deb76\x2d4677\x2d8333\x2dd02c9507adbc.mount: Deactivated successfully.
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.552 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.805 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.806 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.806 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-unplugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.807 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Acquiring lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG oslo_concurrency.lockutils [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.808 187212 DEBUG nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] No waiting events found dispatching network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  5 07:19:59 np0005546909 nova_compute[187208]: 2025-12-05 12:19:59.809 187212 WARNING nova.compute.manager [req-4dfb0c7f-8b04-4e1c-af70-305715a8dc7d req-1f9865b6-9adc-4557-8c7b-1d4fafee0a87 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received unexpected event network-vif-plugged-ddf5ec0d-377b-480c-8991-738446cfb2db for instance with vm_state active and task_state deleting.#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.495 187212 DEBUG nova.network.neutron [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.512 187212 INFO nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Took 1.33 seconds to deallocate network for instance.#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.562 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.562 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.628 187212 DEBUG nova.compute.provider_tree [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.653 187212 DEBUG nova.scheduler.client.report [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.674 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.712 187212 INFO nova.scheduler.client.report [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Deleted allocations for instance 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.751 187212 DEBUG nova.compute.manager [req-186f0cf4-77ef-4cfa-9de9-83f3f1d31fa5 req-936c14f6-b75c-4464-93cd-1bf8dcc13470 28ac976133b24ba1b6d4179c2f465aae 9688817a02f248cdb27666bc1359b2ba - - default default] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Received event network-vif-deleted-ddf5ec0d-377b-480c-8991-738446cfb2db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  5 07:20:00 np0005546909 nova_compute[187208]: 2025-12-05 12:20:00.970 187212 DEBUG oslo_concurrency.lockutils [None req-a80e2a63-7525-4b80-9e47-1c08566b54b1 223f7822261946cc9228b2207bd1096c 3463fde58c6c4bea98c82b2cb087a0dd - - default default] Lock "7eb08f99-b40c-4ba3-9b30-6cfb447ba68d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:20:01 np0005546909 nova_compute[187208]: 2025-12-05 12:20:01.611 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:20:02 np0005546909 nova_compute[187208]: 2025-12-05 12:20:02.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:20:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.023 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:20:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:20:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:20:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:04.093 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:20:04 np0005546909 nova_compute[187208]: 2025-12-05 12:20:04.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:04 np0005546909 nova_compute[187208]: 2025-12-05 12:20:04.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:06 np0005546909 podman[243135]: 2025-12-05 12:20:06.207084505 +0000 UTC m=+0.055394566 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:20:09 np0005546909 nova_compute[187208]: 2025-12-05 12:20:09.120 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:09 np0005546909 nova_compute[187208]: 2025-12-05 12:20:09.555 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:12 np0005546909 podman[243159]: 2025-12-05 12:20:12.229959671 +0000 UTC m=+0.083990164 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:20:14 np0005546909 nova_compute[187208]: 2025-12-05 12:20:14.088 187212 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764937199.0874865, 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  5 07:20:14 np0005546909 nova_compute[187208]: 2025-12-05 12:20:14.088 187212 INFO nova.compute.manager [-] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] VM Stopped (Lifecycle Event)#033[00m
Dec  5 07:20:14 np0005546909 nova_compute[187208]: 2025-12-05 12:20:14.122 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:14 np0005546909 nova_compute[187208]: 2025-12-05 12:20:14.150 187212 DEBUG nova.compute.manager [None req-a360d02f-62f0-4717-92f4-4248aa2efd85 - - - - - -] [instance: 7eb08f99-b40c-4ba3-9b30-6cfb447ba68d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  5 07:20:14 np0005546909 nova_compute[187208]: 2025-12-05 12:20:14.557 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:16 np0005546909 podman[243180]: 2025-12-05 12:20:16.202787507 +0000 UTC m=+0.059389490 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git)
Dec  5 07:20:16 np0005546909 podman[243181]: 2025-12-05 12:20:16.220697839 +0000 UTC m=+0.072990539 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec  5 07:20:19 np0005546909 nova_compute[187208]: 2025-12-05 12:20:19.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:19 np0005546909 nova_compute[187208]: 2025-12-05 12:20:19.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:24 np0005546909 nova_compute[187208]: 2025-12-05 12:20:24.127 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:24 np0005546909 nova_compute[187208]: 2025-12-05 12:20:24.560 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:25 np0005546909 podman[243219]: 2025-12-05 12:20:25.200974739 +0000 UTC m=+0.053219053 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:20:25 np0005546909 podman[243220]: 2025-12-05 12:20:25.206994301 +0000 UTC m=+0.054495500 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:20:25 np0005546909 podman[243221]: 2025-12-05 12:20:25.233759787 +0000 UTC m=+0.076928872 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:20:29 np0005546909 nova_compute[187208]: 2025-12-05 12:20:29.131 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:29 np0005546909 nova_compute[187208]: 2025-12-05 12:20:29.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:34 np0005546909 nova_compute[187208]: 2025-12-05 12:20:34.135 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:34 np0005546909 nova_compute[187208]: 2025-12-05 12:20:34.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:20:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:20:37 np0005546909 podman[243282]: 2025-12-05 12:20:37.197914274 +0000 UTC m=+0.050129385 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:20:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:37.611 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:20:37 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:37.611 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:20:37 np0005546909 nova_compute[187208]: 2025-12-05 12:20:37.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:39 np0005546909 nova_compute[187208]: 2025-12-05 12:20:39.139 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:39 np0005546909 nova_compute[187208]: 2025-12-05 12:20:39.564 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:41 np0005546909 nova_compute[187208]: 2025-12-05 12:20:41.399 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:20:41 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:20:41.613 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:20:43 np0005546909 podman[243306]: 2025-12-05 12:20:43.197399659 +0000 UTC m=+0.051704040 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec  5 07:20:44 np0005546909 nova_compute[187208]: 2025-12-05 12:20:44.141 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:44 np0005546909 nova_compute[187208]: 2025-12-05 12:20:44.566 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:47 np0005546909 podman[243328]: 2025-12-05 12:20:47.205837124 +0000 UTC m=+0.054730197 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  5 07:20:47 np0005546909 podman[243327]: 2025-12-05 12:20:47.208767908 +0000 UTC m=+0.060204973 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public)
Dec  5 07:20:49 np0005546909 nova_compute[187208]: 2025-12-05 12:20:49.144 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:49 np0005546909 nova_compute[187208]: 2025-12-05 12:20:49.568 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.077 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:51 np0005546909 nova_compute[187208]: 2025-12-05 12:20:51.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  5 07:20:53 np0005546909 nova_compute[187208]: 2025-12-05 12:20:53.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:54 np0005546909 nova_compute[187208]: 2025-12-05 12:20:54.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:54 np0005546909 nova_compute[187208]: 2025-12-05 12:20:54.148 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:54 np0005546909 nova_compute[187208]: 2025-12-05 12:20:54.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:55 np0005546909 nova_compute[187208]: 2025-12-05 12:20:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:56 np0005546909 podman[243363]: 2025-12-05 12:20:56.19782976 +0000 UTC m=+0.056385954 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:20:56 np0005546909 podman[243364]: 2025-12-05 12:20:56.205473149 +0000 UTC m=+0.057076824 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:20:56 np0005546909 podman[243365]: 2025-12-05 12:20:56.241214011 +0000 UTC m=+0.092237279 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:20:57 np0005546909 nova_compute[187208]: 2025-12-05 12:20:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.100 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.300 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.301 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.0406265258789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.301 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.302 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.401 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.402 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.456 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.471 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.505 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  5 07:20:58 np0005546909 nova_compute[187208]: 2025-12-05 12:20:58.506 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:20:59 np0005546909 nova_compute[187208]: 2025-12-05 12:20:59.151 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:20:59 np0005546909 nova_compute[187208]: 2025-12-05 12:20:59.573 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:02 np0005546909 nova_compute[187208]: 2025-12-05 12:21:02.501 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:21:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.024 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:21:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:21:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:21:04 np0005546909 nova_compute[187208]: 2025-12-05 12:21:04.154 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:04 np0005546909 nova_compute[187208]: 2025-12-05 12:21:04.575 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:08 np0005546909 podman[243429]: 2025-12-05 12:21:08.192776778 +0000 UTC m=+0.049020884 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:21:09 np0005546909 nova_compute[187208]: 2025-12-05 12:21:09.158 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:09 np0005546909 nova_compute[187208]: 2025-12-05 12:21:09.577 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:12 np0005546909 ovn_controller[95610]: 2025-12-05T12:21:12Z|01134|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  5 07:21:14 np0005546909 nova_compute[187208]: 2025-12-05 12:21:14.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:14 np0005546909 podman[243454]: 2025-12-05 12:21:14.221105859 +0000 UTC m=+0.067801101 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Dec  5 07:21:14 np0005546909 nova_compute[187208]: 2025-12-05 12:21:14.579 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:21:18 np0005546909 podman[243476]: 2025-12-05 12:21:18.197141806 +0000 UTC m=+0.047589572 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:21:18 np0005546909 podman[243475]: 2025-12-05 12:21:18.204694192 +0000 UTC m=+0.057813395 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:21:19 np0005546909 nova_compute[187208]: 2025-12-05 12:21:19.165 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:19 np0005546909 nova_compute[187208]: 2025-12-05 12:21:19.581 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:24 np0005546909 nova_compute[187208]: 2025-12-05 12:21:24.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:24 np0005546909 nova_compute[187208]: 2025-12-05 12:21:24.583 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:27 np0005546909 podman[243519]: 2025-12-05 12:21:27.208639132 +0000 UTC m=+0.051207256 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:21:27 np0005546909 podman[243518]: 2025-12-05 12:21:27.208364824 +0000 UTC m=+0.059269686 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:21:27 np0005546909 podman[243525]: 2025-12-05 12:21:27.265810668 +0000 UTC m=+0.095152603 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:21:29 np0005546909 nova_compute[187208]: 2025-12-05 12:21:29.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:29 np0005546909 nova_compute[187208]: 2025-12-05 12:21:29.584 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:34 np0005546909 nova_compute[187208]: 2025-12-05 12:21:34.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:34 np0005546909 nova_compute[187208]: 2025-12-05 12:21:34.586 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:39 np0005546909 nova_compute[187208]: 2025-12-05 12:21:39.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:39 np0005546909 podman[243588]: 2025-12-05 12:21:39.213941787 +0000 UTC m=+0.057222302 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:21:39 np0005546909 nova_compute[187208]: 2025-12-05 12:21:39.587 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:44 np0005546909 nova_compute[187208]: 2025-12-05 12:21:44.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:44 np0005546909 nova_compute[187208]: 2025-12-05 12:21:44.588 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:45 np0005546909 podman[243614]: 2025-12-05 12:21:45.195008037 +0000 UTC m=+0.053047863 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm)
Dec  5 07:21:49 np0005546909 nova_compute[187208]: 2025-12-05 12:21:49.185 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:49 np0005546909 podman[243634]: 2025-12-05 12:21:49.207576076 +0000 UTC m=+0.057256453 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Dec  5 07:21:49 np0005546909 podman[243635]: 2025-12-05 12:21:49.234719715 +0000 UTC m=+0.079401719 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  5 07:21:49 np0005546909 nova_compute[187208]: 2025-12-05 12:21:49.590 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:52 np0005546909 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:52 np0005546909 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:21:52 np0005546909 nova_compute[187208]: 2025-12-05 12:21:52.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:21:52 np0005546909 nova_compute[187208]: 2025-12-05 12:21:52.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:21:53 np0005546909 nova_compute[187208]: 2025-12-05 12:21:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:53 np0005546909 nova_compute[187208]: 2025-12-05 12:21:53.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:21:54 np0005546909 nova_compute[187208]: 2025-12-05 12:21:54.187 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:54 np0005546909 nova_compute[187208]: 2025-12-05 12:21:54.593 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:55 np0005546909 nova_compute[187208]: 2025-12-05 12:21:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:55 np0005546909 nova_compute[187208]: 2025-12-05 12:21:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:55 np0005546909 nova_compute[187208]: 2025-12-05 12:21:55.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:58 np0005546909 nova_compute[187208]: 2025-12-05 12:21:58.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:58 np0005546909 podman[243674]: 2025-12-05 12:21:58.201291709 +0000 UTC m=+0.058456998 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:21:58 np0005546909 podman[243675]: 2025-12-05 12:21:58.220645084 +0000 UTC m=+0.074918030 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:21:58 np0005546909 nova_compute[187208]: 2025-12-05 12:21:58.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:58.232 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:21:58 np0005546909 podman[243676]: 2025-12-05 12:21:58.234216524 +0000 UTC m=+0.083751864 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:21:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:58.234 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.188 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.230 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.04093551635742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.231 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:21:59 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:21:59.236 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.311 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.312 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.337 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.362 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.363 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.363 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:21:59 np0005546909 nova_compute[187208]: 2025-12-05 12:21:59.596 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:00 np0005546909 nova_compute[187208]: 2025-12-05 12:22:00.364 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.908 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.909 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.910 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:02 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:02.912 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4cec93-7abc-47fa-bf8e-f3a3f045eb29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.025 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:22:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:22:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:22:03 np0005546909 nova_compute[187208]: 2025-12-05 12:22:03.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:04 np0005546909 nova_compute[187208]: 2025-12-05 12:22:04.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:04 np0005546909 nova_compute[187208]: 2025-12-05 12:22:04.192 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:04 np0005546909 nova_compute[187208]: 2025-12-05 12:22:04.598 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.684 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.686 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.688 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:07 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:07.689 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[93606769-1c4b-4d13-9f09-5dd0632681da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:08.997 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:08 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:08.999 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:09.001 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:09 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:09.002 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c4b4418-e7a8-40e9-a977-752ca4b57ec7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:09 np0005546909 nova_compute[187208]: 2025-12-05 12:22:09.195 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:09 np0005546909 nova_compute[187208]: 2025-12-05 12:22:09.599 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:10 np0005546909 podman[243743]: 2025-12-05 12:22:10.200224824 +0000 UTC m=+0.051726815 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:22:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.816 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.817 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.818 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:12 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:12.819 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[761caaf4-fd6a-46f8-87ba-c3e0feda4afd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:14 np0005546909 nova_compute[187208]: 2025-12-05 12:22:14.198 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.331 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.332 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.333 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:14 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:14.334 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8f409b-b6f1-42ad-965c-2afd5765d1d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:14 np0005546909 nova_compute[187208]: 2025-12-05 12:22:14.601 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:16 np0005546909 podman[243767]: 2025-12-05 12:22:16.222589017 +0000 UTC m=+0.071013228 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec  5 07:22:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.872 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.873 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.874 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:18.875 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7ac410-11e3-4643-b09c-fb918b5edaf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:19 np0005546909 nova_compute[187208]: 2025-12-05 12:22:19.202 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:19 np0005546909 nova_compute[187208]: 2025-12-05 12:22:19.604 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.984 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.985 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.986 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:19 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:19.987 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7b6ba2-e6c0-4290-b59f-a9d08a96a071]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:20 np0005546909 podman[243789]: 2025-12-05 12:22:20.204325332 +0000 UTC m=+0.054113304 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:22:20 np0005546909 podman[243788]: 2025-12-05 12:22:20.233819288 +0000 UTC m=+0.087278025 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec  5 07:22:24 np0005546909 nova_compute[187208]: 2025-12-05 12:22:24.205 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:24 np0005546909 nova_compute[187208]: 2025-12-05 12:22:24.605 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.344 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 10.100.0.2 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 
'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.345 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.346 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:27 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:27.347 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[4d256603-6c7a-443c-a18b-89fe6eaba5f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:29 np0005546909 podman[243828]: 2025-12-05 12:22:29.207091654 +0000 UTC m=+0.053461775 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:22:29 np0005546909 nova_compute[187208]: 2025-12-05 12:22:29.208 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:29 np0005546909 podman[243827]: 2025-12-05 12:22:29.235269722 +0000 UTC m=+0.086756010 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec  5 07:22:29 np0005546909 podman[243829]: 2025-12-05 12:22:29.247967387 +0000 UTC m=+0.088759968 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller)
Dec  5 07:22:29 np0005546909 nova_compute[187208]: 2025-12-05 12:22:29.608 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:34 np0005546909 nova_compute[187208]: 2025-12-05 12:22:34.212 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:34 np0005546909 nova_compute[187208]: 2025-12-05 12:22:34.610 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:22:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:22:39 np0005546909 nova_compute[187208]: 2025-12-05 12:22:39.215 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:39 np0005546909 nova_compute[187208]: 2025-12-05 12:22:39.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:41 np0005546909 podman[243892]: 2025-12-05 12:22:41.190134974 +0000 UTC m=+0.048786590 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:22:44 np0005546909 nova_compute[187208]: 2025-12-05 12:22:44.218 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:44 np0005546909 nova_compute[187208]: 2025-12-05 12:22:44.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:47 np0005546909 podman[243917]: 2025-12-05 12:22:47.209887713 +0000 UTC m=+0.062234396 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:22:49 np0005546909 nova_compute[187208]: 2025-12-05 12:22:49.222 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:49 np0005546909 nova_compute[187208]: 2025-12-05 12:22:49.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:51 np0005546909 podman[243938]: 2025-12-05 12:22:51.197885838 +0000 UTC m=+0.051191220 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:22:51 np0005546909 podman[243937]: 2025-12-05 12:22:51.20528683 +0000 UTC m=+0.061337630 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec  5 07:22:53 np0005546909 nova_compute[187208]: 2025-12-05 12:22:53.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:53 np0005546909 nova_compute[187208]: 2025-12-05 12:22:53.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:22:53 np0005546909 nova_compute[187208]: 2025-12-05 12:22:53.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:22:53 np0005546909 nova_compute[187208]: 2025-12-05 12:22:53.346 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:22:54 np0005546909 nova_compute[187208]: 2025-12-05 12:22:54.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:54 np0005546909 nova_compute[187208]: 2025-12-05 12:22:54.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:22:54 np0005546909 nova_compute[187208]: 2025-12-05 12:22:54.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:54 np0005546909 nova_compute[187208]: 2025-12-05 12:22:54.616 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:55 np0005546909 nova_compute[187208]: 2025-12-05 12:22:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:55 np0005546909 nova_compute[187208]: 2025-12-05 12:22:55.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:57 np0005546909 nova_compute[187208]: 2025-12-05 12:22:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.069 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8:0:1:f816:3eff:fe91:c41b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8feba42e-782a-453d-87a8-7ecce5cb9d21, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fdb7a08e-9e07-4af0-87a6-67969e26ddc5) old=Port_Binding(mac=['fa:16:3e:91:c4:1b 2001:db8::f816:3eff:fe91:c41b'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe91:c41b/64', 'neutron:device_id': 'ovnmeta-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c310d2c-ed63-425b-b049-e294b3183fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28ce44d5f7f9452cb61d4dc2fe27d0c5', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.071 104471 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fdb7a08e-9e07-4af0-87a6-67969e26ddc5 in datapath 4c310d2c-ed63-425b-b049-e294b3183fee updated#033[00m
Dec  5 07:22:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.072 104471 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c310d2c-ed63-425b-b049-e294b3183fee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  5 07:22:57 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:57.073 214158 DEBUG oslo.privsep.daemon [-] privsep: reply[c04cac78-9ed0-4948-861d-e2824475cabd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  5 07:22:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:58.308 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:22:58 np0005546909 nova_compute[187208]: 2025-12-05 12:22:58.309 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:22:58.309 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:22:59 np0005546909 nova_compute[187208]: 2025-12-05 12:22:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:22:59 np0005546909 nova_compute[187208]: 2025-12-05 12:22:59.232 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:22:59 np0005546909 nova_compute[187208]: 2025-12-05 12:22:59.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:00 np0005546909 nova_compute[187208]: 2025-12-05 12:23:00.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:00 np0005546909 podman[243976]: 2025-12-05 12:23:00.197872233 +0000 UTC m=+0.051226470 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:23:00 np0005546909 podman[243975]: 2025-12-05 12:23:00.197847412 +0000 UTC m=+0.054843544 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec  5 07:23:00 np0005546909 podman[243977]: 2025-12-05 12:23:00.229111609 +0000 UTC m=+0.078811472 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.287 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5688MB free_disk=73.04093551635742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.288 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.351 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.352 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.389 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.404 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.405 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:23:01 np0005546909 nova_compute[187208]: 2025-12-05 12:23:01.406 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:23:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:23:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:23:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:03.027 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:23:04 np0005546909 nova_compute[187208]: 2025-12-05 12:23:04.236 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:04 np0005546909 nova_compute[187208]: 2025-12-05 12:23:04.401 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:04 np0005546909 nova_compute[187208]: 2025-12-05 12:23:04.620 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:06 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:06.311 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:23:09 np0005546909 nova_compute[187208]: 2025-12-05 12:23:09.239 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:09 np0005546909 nova_compute[187208]: 2025-12-05 12:23:09.624 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:12 np0005546909 podman[244044]: 2025-12-05 12:23:12.199944629 +0000 UTC m=+0.053093615 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:23:14 np0005546909 nova_compute[187208]: 2025-12-05 12:23:14.242 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:14 np0005546909 nova_compute[187208]: 2025-12-05 12:23:14.625 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:18 np0005546909 podman[244069]: 2025-12-05 12:23:18.207927023 +0000 UTC m=+0.063598186 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:23:19 np0005546909 nova_compute[187208]: 2025-12-05 12:23:19.246 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:19 np0005546909 nova_compute[187208]: 2025-12-05 12:23:19.626 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:22 np0005546909 podman[244092]: 2025-12-05 12:23:22.196263728 +0000 UTC m=+0.049708647 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:23:22 np0005546909 podman[244091]: 2025-12-05 12:23:22.196282509 +0000 UTC m=+0.052999322 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec  5 07:23:24 np0005546909 nova_compute[187208]: 2025-12-05 12:23:24.250 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:24 np0005546909 nova_compute[187208]: 2025-12-05 12:23:24.628 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:29 np0005546909 nova_compute[187208]: 2025-12-05 12:23:29.418 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:29 np0005546909 nova_compute[187208]: 2025-12-05 12:23:29.629 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:31 np0005546909 podman[244131]: 2025-12-05 12:23:31.216918154 +0000 UTC m=+0.060997510 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:23:31 np0005546909 podman[244132]: 2025-12-05 12:23:31.233476909 +0000 UTC m=+0.075241199 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec  5 07:23:31 np0005546909 podman[244130]: 2025-12-05 12:23:31.234327814 +0000 UTC m=+0.084650450 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:23:34 np0005546909 nova_compute[187208]: 2025-12-05 12:23:34.422 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:34 np0005546909 nova_compute[187208]: 2025-12-05 12:23:34.632 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:39 np0005546909 nova_compute[187208]: 2025-12-05 12:23:39.425 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:39 np0005546909 nova_compute[187208]: 2025-12-05 12:23:39.634 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:41 np0005546909 nova_compute[187208]: 2025-12-05 12:23:41.099 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:43 np0005546909 podman[244204]: 2025-12-05 12:23:43.192921821 +0000 UTC m=+0.051384005 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:23:44 np0005546909 nova_compute[187208]: 2025-12-05 12:23:44.429 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:44 np0005546909 nova_compute[187208]: 2025-12-05 12:23:44.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:49 np0005546909 podman[244228]: 2025-12-05 12:23:49.21123089 +0000 UTC m=+0.060133646 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:23:49 np0005546909 nova_compute[187208]: 2025-12-05 12:23:49.432 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:49 np0005546909 nova_compute[187208]: 2025-12-05 12:23:49.637 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:53 np0005546909 podman[244249]: 2025-12-05 12:23:53.202908031 +0000 UTC m=+0.047948906 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  5 07:23:53 np0005546909 podman[244248]: 2025-12-05 12:23:53.211660842 +0000 UTC m=+0.058684664 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64)
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.063 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.090 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.436 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:54 np0005546909 nova_compute[187208]: 2025-12-05 12:23:54.639 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:56 np0005546909 nova_compute[187208]: 2025-12-05 12:23:56.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:56 np0005546909 nova_compute[187208]: 2025-12-05 12:23:56.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:57 np0005546909 nova_compute[187208]: 2025-12-05 12:23:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:57 np0005546909 nova_compute[187208]: 2025-12-05 12:23:57.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:23:57 np0005546909 nova_compute[187208]: 2025-12-05 12:23:57.085 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:58.453 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:23:58 np0005546909 nova_compute[187208]: 2025-12-05 12:23:58.454 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:58 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:23:58.454 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:23:59 np0005546909 nova_compute[187208]: 2025-12-05 12:23:59.102 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:23:59 np0005546909 nova_compute[187208]: 2025-12-05 12:23:59.438 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:23:59 np0005546909 nova_compute[187208]: 2025-12-05 12:23:59.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:00 np0005546909 nova_compute[187208]: 2025-12-05 12:24:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:01 np0005546909 nova_compute[187208]: 2025-12-05 12:24:01.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:01 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:24:01.456 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:24:02 np0005546909 podman[244288]: 2025-12-05 12:24:02.216874517 +0000 UTC m=+0.059828848 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:24:02 np0005546909 podman[244287]: 2025-12-05 12:24:02.226502553 +0000 UTC m=+0.065928412 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:24:02 np0005546909 podman[244289]: 2025-12-05 12:24:02.257794591 +0000 UTC m=+0.091320751 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec  5 07:24:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.028 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:24:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.028 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:24:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:24:03.029 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.253 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5695MB free_disk=73.03999328613281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.254 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.482 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.482 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.559 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.571 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.572 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:24:03 np0005546909 nova_compute[187208]: 2025-12-05 12:24:03.573 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:24:04 np0005546909 nova_compute[187208]: 2025-12-05 12:24:04.442 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:04 np0005546909 nova_compute[187208]: 2025-12-05 12:24:04.643 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:05 np0005546909 nova_compute[187208]: 2025-12-05 12:24:05.567 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:05 np0005546909 nova_compute[187208]: 2025-12-05 12:24:05.568 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:07 np0005546909 nova_compute[187208]: 2025-12-05 12:24:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:07 np0005546909 nova_compute[187208]: 2025-12-05 12:24:07.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:24:07 np0005546909 nova_compute[187208]: 2025-12-05 12:24:07.080 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:24:09 np0005546909 nova_compute[187208]: 2025-12-05 12:24:09.446 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:09 np0005546909 nova_compute[187208]: 2025-12-05 12:24:09.646 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:14 np0005546909 podman[244358]: 2025-12-05 12:24:14.209783038 +0000 UTC m=+0.052107065 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:24:14 np0005546909 nova_compute[187208]: 2025-12-05 12:24:14.449 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:14 np0005546909 nova_compute[187208]: 2025-12-05 12:24:14.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:19 np0005546909 nova_compute[187208]: 2025-12-05 12:24:19.096 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:19 np0005546909 nova_compute[187208]: 2025-12-05 12:24:19.453 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:19 np0005546909 nova_compute[187208]: 2025-12-05 12:24:19.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:20 np0005546909 podman[244382]: 2025-12-05 12:24:20.221867138 +0000 UTC m=+0.075780875 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:24:24 np0005546909 podman[244402]: 2025-12-05 12:24:24.201922054 +0000 UTC m=+0.055950416 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9)
Dec  5 07:24:24 np0005546909 podman[244403]: 2025-12-05 12:24:24.202782869 +0000 UTC m=+0.052101015 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  5 07:24:24 np0005546909 nova_compute[187208]: 2025-12-05 12:24:24.495 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:24 np0005546909 nova_compute[187208]: 2025-12-05 12:24:24.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:29 np0005546909 nova_compute[187208]: 2025-12-05 12:24:29.543 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:29 np0005546909 nova_compute[187208]: 2025-12-05 12:24:29.653 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:33 np0005546909 podman[244441]: 2025-12-05 12:24:33.198930552 +0000 UTC m=+0.052880979 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  5 07:24:33 np0005546909 podman[244442]: 2025-12-05 12:24:33.198930452 +0000 UTC m=+0.049690307 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:24:33 np0005546909 podman[244443]: 2025-12-05 12:24:33.226850943 +0000 UTC m=+0.075419665 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:24:34 np0005546909 nova_compute[187208]: 2025-12-05 12:24:34.545 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:34 np0005546909 nova_compute[187208]: 2025-12-05 12:24:34.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:24:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:24:39 np0005546909 nova_compute[187208]: 2025-12-05 12:24:39.549 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:39 np0005546909 nova_compute[187208]: 2025-12-05 12:24:39.656 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:44 np0005546909 nova_compute[187208]: 2025-12-05 12:24:44.553 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:44 np0005546909 podman[244511]: 2025-12-05 12:24:44.633178757 +0000 UTC m=+0.055749140 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:24:44 np0005546909 nova_compute[187208]: 2025-12-05 12:24:44.658 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:49 np0005546909 nova_compute[187208]: 2025-12-05 12:24:49.556 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:49 np0005546909 nova_compute[187208]: 2025-12-05 12:24:49.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:51 np0005546909 podman[244535]: 2025-12-05 12:24:51.19511435 +0000 UTC m=+0.053228498 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.085 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.136 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.558 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:54 np0005546909 nova_compute[187208]: 2025-12-05 12:24:54.661 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:55 np0005546909 nova_compute[187208]: 2025-12-05 12:24:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:55 np0005546909 nova_compute[187208]: 2025-12-05 12:24:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:24:55 np0005546909 podman[244557]: 2025-12-05 12:24:55.199829014 +0000 UTC m=+0.046471514 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  5 07:24:55 np0005546909 podman[244556]: 2025-12-05 12:24:55.230912506 +0000 UTC m=+0.083659151 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64)
Dec  5 07:24:57 np0005546909 nova_compute[187208]: 2025-12-05 12:24:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:58 np0005546909 nova_compute[187208]: 2025-12-05 12:24:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:59 np0005546909 nova_compute[187208]: 2025-12-05 12:24:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:24:59 np0005546909 nova_compute[187208]: 2025-12-05 12:24:59.561 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:24:59 np0005546909 nova_compute[187208]: 2025-12-05 12:24:59.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:01 np0005546909 nova_compute[187208]: 2025-12-05 12:25:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:02 np0005546909 nova_compute[187208]: 2025-12-05 12:25:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.029 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:25:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.030 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:25:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:03.031 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:04 np0005546909 podman[244597]: 2025-12-05 12:25:04.200245189 +0000 UTC m=+0.046565877 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.205 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.206 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:25:04 np0005546909 podman[244596]: 2025-12-05 12:25:04.238246619 +0000 UTC m=+0.089493748 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  5 07:25:04 np0005546909 podman[244598]: 2025-12-05 12:25:04.268083535 +0000 UTC m=+0.112027805 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.358 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.359 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.03991317749023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.360 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.360 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.469 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.470 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.562 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.639 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.656 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.656 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.668 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.687 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.708 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.749 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:25:04 np0005546909 nova_compute[187208]: 2025-12-05 12:25:04.751 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:04.751 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:25:04 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:04.752 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:25:05 np0005546909 nova_compute[187208]: 2025-12-05 12:25:05.747 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:09 np0005546909 nova_compute[187208]: 2025-12-05 12:25:09.614 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:09 np0005546909 nova_compute[187208]: 2025-12-05 12:25:09.665 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:10 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:25:10.754 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:25:14 np0005546909 nova_compute[187208]: 2025-12-05 12:25:14.618 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:14 np0005546909 nova_compute[187208]: 2025-12-05 12:25:14.667 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:15 np0005546909 podman[244665]: 2025-12-05 12:25:15.194022978 +0000 UTC m=+0.052718493 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:25:19 np0005546909 nova_compute[187208]: 2025-12-05 12:25:19.621 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:19 np0005546909 nova_compute[187208]: 2025-12-05 12:25:19.669 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:22 np0005546909 podman[244695]: 2025-12-05 12:25:22.194822611 +0000 UTC m=+0.048496792 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:25:24 np0005546909 nova_compute[187208]: 2025-12-05 12:25:24.632 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:24 np0005546909 nova_compute[187208]: 2025-12-05 12:25:24.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:26 np0005546909 podman[244721]: 2025-12-05 12:25:26.214283648 +0000 UTC m=+0.060659342 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Dec  5 07:25:26 np0005546909 podman[244722]: 2025-12-05 12:25:26.22725775 +0000 UTC m=+0.075246900 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 07:25:29 np0005546909 nova_compute[187208]: 2025-12-05 12:25:29.636 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:29 np0005546909 nova_compute[187208]: 2025-12-05 12:25:29.671 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:34 np0005546909 nova_compute[187208]: 2025-12-05 12:25:34.638 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:34 np0005546909 nova_compute[187208]: 2025-12-05 12:25:34.672 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:35 np0005546909 podman[244761]: 2025-12-05 12:25:35.198160839 +0000 UTC m=+0.054007320 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  5 07:25:35 np0005546909 podman[244762]: 2025-12-05 12:25:35.204857082 +0000 UTC m=+0.055383420 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:25:35 np0005546909 podman[244763]: 2025-12-05 12:25:35.227721808 +0000 UTC m=+0.076182457 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:25:39 np0005546909 nova_compute[187208]: 2025-12-05 12:25:39.641 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:39 np0005546909 nova_compute[187208]: 2025-12-05 12:25:39.673 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:44 np0005546909 nova_compute[187208]: 2025-12-05 12:25:44.645 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:44 np0005546909 nova_compute[187208]: 2025-12-05 12:25:44.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:46 np0005546909 podman[244827]: 2025-12-05 12:25:46.200985889 +0000 UTC m=+0.048510373 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:25:49 np0005546909 nova_compute[187208]: 2025-12-05 12:25:49.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:49 np0005546909 nova_compute[187208]: 2025-12-05 12:25:49.677 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:53 np0005546909 podman[244851]: 2025-12-05 12:25:53.19716427 +0000 UTC m=+0.046537066 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.654 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:54 np0005546909 nova_compute[187208]: 2025-12-05 12:25:54.679 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:55 np0005546909 nova_compute[187208]: 2025-12-05 12:25:55.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:55 np0005546909 nova_compute[187208]: 2025-12-05 12:25:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:25:57 np0005546909 nova_compute[187208]: 2025-12-05 12:25:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:25:57 np0005546909 podman[244871]: 2025-12-05 12:25:57.213648904 +0000 UTC m=+0.064142080 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec  5 07:25:57 np0005546909 podman[244872]: 2025-12-05 12:25:57.231183949 +0000 UTC m=+0.074143788 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:25:59 np0005546909 nova_compute[187208]: 2025-12-05 12:25:59.657 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:25:59 np0005546909 nova_compute[187208]: 2025-12-05 12:25:59.680 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:00 np0005546909 nova_compute[187208]: 2025-12-05 12:26:00.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:01 np0005546909 nova_compute[187208]: 2025-12-05 12:26:01.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:02 np0005546909 nova_compute[187208]: 2025-12-05 12:26:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.031 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:26:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:26:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:26:03 np0005546909 nova_compute[187208]: 2025-12-05 12:26:03.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:04 np0005546909 nova_compute[187208]: 2025-12-05 12:26:04.660 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:04 np0005546909 nova_compute[187208]: 2025-12-05 12:26:04.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.258 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.259 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5707MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.259 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.260 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.343 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.344 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.379 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.398 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.400 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:26:05 np0005546909 nova_compute[187208]: 2025-12-05 12:26:05.400 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:26:06 np0005546909 podman[244912]: 2025-12-05 12:26:06.24187057 +0000 UTC m=+0.056585792 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  5 07:26:06 np0005546909 podman[244913]: 2025-12-05 12:26:06.26306466 +0000 UTC m=+0.075761564 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:26:06 np0005546909 podman[244914]: 2025-12-05 12:26:06.29600709 +0000 UTC m=+0.104682128 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:26:09 np0005546909 nova_compute[187208]: 2025-12-05 12:26:09.663 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:09 np0005546909 nova_compute[187208]: 2025-12-05 12:26:09.683 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:10 np0005546909 nova_compute[187208]: 2025-12-05 12:26:10.396 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:14 np0005546909 nova_compute[187208]: 2025-12-05 12:26:14.675 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:14 np0005546909 nova_compute[187208]: 2025-12-05 12:26:14.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:17 np0005546909 podman[244981]: 2025-12-05 12:26:17.218966469 +0000 UTC m=+0.070235155 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:26:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:18.086 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:26:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:18.086 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:26:18 np0005546909 nova_compute[187208]: 2025-12-05 12:26:18.088 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:19 np0005546909 nova_compute[187208]: 2025-12-05 12:26:19.678 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:19 np0005546909 nova_compute[187208]: 2025-12-05 12:26:19.685 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:24 np0005546909 podman[245006]: 2025-12-05 12:26:24.204977893 +0000 UTC m=+0.056342385 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:26:24 np0005546909 nova_compute[187208]: 2025-12-05 12:26:24.681 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:24 np0005546909 nova_compute[187208]: 2025-12-05 12:26:24.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:26:28.089 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:26:28 np0005546909 podman[245026]: 2025-12-05 12:26:28.228515553 +0000 UTC m=+0.081562922 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Dec  5 07:26:28 np0005546909 podman[245027]: 2025-12-05 12:26:28.228526223 +0000 UTC m=+0.078196695 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 07:26:29 np0005546909 nova_compute[187208]: 2025-12-05 12:26:29.684 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:29 np0005546909 nova_compute[187208]: 2025-12-05 12:26:29.687 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:34 np0005546909 nova_compute[187208]: 2025-12-05 12:26:34.686 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:34 np0005546909 nova_compute[187208]: 2025-12-05 12:26:34.688 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.365 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:26:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:26:37 np0005546909 podman[245067]: 2025-12-05 12:26:37.193944567 +0000 UTC m=+0.047821399 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:26:37 np0005546909 podman[245066]: 2025-12-05 12:26:37.194306378 +0000 UTC m=+0.050541858 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Dec  5 07:26:37 np0005546909 podman[245068]: 2025-12-05 12:26:37.233991301 +0000 UTC m=+0.083848067 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.690 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.691 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:39 np0005546909 nova_compute[187208]: 2025-12-05 12:26:39.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:26:44 np0005546909 nova_compute[187208]: 2025-12-05 12:26:44.825 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:26:48 np0005546909 podman[245134]: 2025-12-05 12:26:48.195925263 +0000 UTC m=+0.051032561 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:26:49 np0005546909 nova_compute[187208]: 2025-12-05 12:26:49.827 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.829 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:26:54 np0005546909 nova_compute[187208]: 2025-12-05 12:26:54.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:55 np0005546909 nova_compute[187208]: 2025-12-05 12:26:55.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:55 np0005546909 nova_compute[187208]: 2025-12-05 12:26:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:26:55 np0005546909 nova_compute[187208]: 2025-12-05 12:26:55.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:26:55 np0005546909 nova_compute[187208]: 2025-12-05 12:26:55.077 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:26:55 np0005546909 podman[245158]: 2025-12-05 12:26:55.206810893 +0000 UTC m=+0.060459644 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:26:56 np0005546909 nova_compute[187208]: 2025-12-05 12:26:56.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:56 np0005546909 nova_compute[187208]: 2025-12-05 12:26:56.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:26:57 np0005546909 nova_compute[187208]: 2025-12-05 12:26:57.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:26:59 np0005546909 podman[245180]: 2025-12-05 12:26:59.217602366 +0000 UTC m=+0.058193728 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  5 07:26:59 np0005546909 podman[245179]: 2025-12-05 12:26:59.238412356 +0000 UTC m=+0.075970621 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 07:26:59 np0005546909 nova_compute[187208]: 2025-12-05 12:26:59.830 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:26:59 np0005546909 nova_compute[187208]: 2025-12-05 12:26:59.832 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:00 np0005546909 nova_compute[187208]: 2025-12-05 12:27:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:01 np0005546909 nova_compute[187208]: 2025-12-05 12:27:01.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.032 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:27:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:27:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:27:03 np0005546909 nova_compute[187208]: 2025-12-05 12:27:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.834 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.836 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.877 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:04 np0005546909 nova_compute[187208]: 2025-12-05 12:27:04.878 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:06 np0005546909 nova_compute[187208]: 2025-12-05 12:27:06.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.094 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.231 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.232 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.285 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.285 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.305 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.319 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.321 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:27:07 np0005546909 nova_compute[187208]: 2025-12-05 12:27:07.321 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:27:08 np0005546909 podman[245219]: 2025-12-05 12:27:08.222963101 +0000 UTC m=+0.058613240 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:27:08 np0005546909 podman[245218]: 2025-12-05 12:27:08.238287653 +0000 UTC m=+0.084884637 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  5 07:27:08 np0005546909 podman[245223]: 2025-12-05 12:27:08.251199965 +0000 UTC m=+0.086558526 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.878 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.880 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:09 np0005546909 nova_compute[187208]: 2025-12-05 12:27:09.881 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.883 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.885 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.885 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:14 np0005546909 nova_compute[187208]: 2025-12-05 12:27:14.908 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:19 np0005546909 podman[245288]: 2025-12-05 12:27:19.197534606 +0000 UTC m=+0.045781331 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:27:19 np0005546909 nova_compute[187208]: 2025-12-05 12:27:19.909 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.910 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.967 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:24 np0005546909 nova_compute[187208]: 2025-12-05 12:27:24.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:25.075 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:27:25 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:25.076 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:27:25 np0005546909 nova_compute[187208]: 2025-12-05 12:27:25.077 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:26 np0005546909 podman[245312]: 2025-12-05 12:27:26.187991226 +0000 UTC m=+0.046542582 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:27:29 np0005546909 nova_compute[187208]: 2025-12-05 12:27:29.968 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:30 np0005546909 podman[245334]: 2025-12-05 12:27:30.208086208 +0000 UTC m=+0.055403258 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec  5 07:27:30 np0005546909 podman[245333]: 2025-12-05 12:27:30.208286423 +0000 UTC m=+0.058109795 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, io.openshift.expose-services=)
Dec  5 07:27:32 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:27:32.078 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:27:34 np0005546909 nova_compute[187208]: 2025-12-05 12:27:34.969 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:39 np0005546909 podman[245374]: 2025-12-05 12:27:39.201089797 +0000 UTC m=+0.048972022 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:27:39 np0005546909 podman[245373]: 2025-12-05 12:27:39.205238096 +0000 UTC m=+0.057762466 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:27:39 np0005546909 podman[245375]: 2025-12-05 12:27:39.23660404 +0000 UTC m=+0.082038515 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:27:39 np0005546909 nova_compute[187208]: 2025-12-05 12:27:39.970 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:44 np0005546909 nova_compute[187208]: 2025-12-05 12:27:44.971 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.974 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.976 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.993 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:49 np0005546909 nova_compute[187208]: 2025-12-05 12:27:49.994 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:27:50 np0005546909 podman[245436]: 2025-12-05 12:27:50.195847153 +0000 UTC m=+0.049659052 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:27:54 np0005546909 nova_compute[187208]: 2025-12-05 12:27:54.995 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:54 np0005546909 nova_compute[187208]: 2025-12-05 12:27:54.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:27:57 np0005546909 podman[245463]: 2025-12-05 12:27:57.241429893 +0000 UTC m=+0.087849953 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Dec  5 07:27:57 np0005546909 nova_compute[187208]: 2025-12-05 12:27:57.322 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:57 np0005546909 nova_compute[187208]: 2025-12-05 12:27:57.323 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:27:57 np0005546909 nova_compute[187208]: 2025-12-05 12:27:57.323 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:27:57 np0005546909 nova_compute[187208]: 2025-12-05 12:27:57.495 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:27:58 np0005546909 nova_compute[187208]: 2025-12-05 12:27:58.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:27:58 np0005546909 nova_compute[187208]: 2025-12-05 12:27:58.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:27:59 np0005546909 nova_compute[187208]: 2025-12-05 12:27:59.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:00 np0005546909 nova_compute[187208]: 2025-12-05 12:27:59.998 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:28:01 np0005546909 podman[245485]: 2025-12-05 12:28:01.197993893 +0000 UTC m=+0.047211671 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  5 07:28:01 np0005546909 podman[245484]: 2025-12-05 12:28:01.208358322 +0000 UTC m=+0.060866195 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Dec  5 07:28:02 np0005546909 nova_compute[187208]: 2025-12-05 12:28:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:02 np0005546909 nova_compute[187208]: 2025-12-05 12:28:02.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.033 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:28:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:28:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:28:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:28:04 np0005546909 nova_compute[187208]: 2025-12-05 12:28:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.000 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.002 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:05 np0005546909 nova_compute[187208]: 2025-12-05 12:28:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:07 np0005546909 nova_compute[187208]: 2025-12-05 12:28:07.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.088 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.089 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.245 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5717MB free_disk=73.04000854492188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.310 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.310 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.331 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.345 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.346 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:28:09 np0005546909 nova_compute[187208]: 2025-12-05 12:28:09.347 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:28:10 np0005546909 nova_compute[187208]: 2025-12-05 12:28:10.003 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:10 np0005546909 podman[245525]: 2025-12-05 12:28:10.19926066 +0000 UTC m=+0.051099104 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:28:10 np0005546909 podman[245524]: 2025-12-05 12:28:10.20617284 +0000 UTC m=+0.059883078 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:28:10 np0005546909 podman[245526]: 2025-12-05 12:28:10.264959704 +0000 UTC m=+0.113311947 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  5 07:28:11 np0005546909 nova_compute[187208]: 2025-12-05 12:28:11.343 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:15 np0005546909 nova_compute[187208]: 2025-12-05 12:28:15.004 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:20 np0005546909 nova_compute[187208]: 2025-12-05 12:28:20.006 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:21 np0005546909 podman[245593]: 2025-12-05 12:28:21.191176534 +0000 UTC m=+0.049671132 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:28:25 np0005546909 nova_compute[187208]: 2025-12-05 12:28:25.008 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:28 np0005546909 podman[245619]: 2025-12-05 12:28:28.214692577 +0000 UTC m=+0.067289820 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:28:30 np0005546909 nova_compute[187208]: 2025-12-05 12:28:30.010 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:32 np0005546909 podman[245639]: 2025-12-05 12:28:32.203226619 +0000 UTC m=+0.057466347 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  5 07:28:32 np0005546909 podman[245640]: 2025-12-05 12:28:32.204885437 +0000 UTC m=+0.055122199 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:28:35 np0005546909 nova_compute[187208]: 2025-12-05 12:28:35.011 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.366 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:28:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:28:40 np0005546909 nova_compute[187208]: 2025-12-05 12:28:40.014 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:41 np0005546909 podman[245680]: 2025-12-05 12:28:41.213090305 +0000 UTC m=+0.057681763 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:28:41 np0005546909 podman[245679]: 2025-12-05 12:28:41.233549095 +0000 UTC m=+0.086625177 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible)
Dec  5 07:28:41 np0005546909 podman[245684]: 2025-12-05 12:28:41.255064135 +0000 UTC m=+0.094995059 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.019 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.022 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:45 np0005546909 nova_compute[187208]: 2025-12-05 12:28:45.025 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:50 np0005546909 nova_compute[187208]: 2025-12-05 12:28:50.055 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:28:50 np0005546909 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:50 np0005546909 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:28:50 np0005546909 nova_compute[187208]: 2025-12-05 12:28:50.056 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:50 np0005546909 nova_compute[187208]: 2025-12-05 12:28:50.057 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:28:52 np0005546909 podman[245749]: 2025-12-05 12:28:52.195538928 +0000 UTC m=+0.051113014 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:28:55 np0005546909 nova_compute[187208]: 2025-12-05 12:28:55.059 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:28:55 np0005546909 systemd-logind[792]: New session 26 of user zuul.
Dec  5 07:28:55 np0005546909 systemd[1]: Started Session 26 of User zuul.
Dec  5 07:28:58 np0005546909 nova_compute[187208]: 2025-12-05 12:28:58.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:58 np0005546909 nova_compute[187208]: 2025-12-05 12:28:58.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:28:58 np0005546909 podman[245915]: 2025-12-05 12:28:58.34939467 +0000 UTC m=+0.066233230 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec  5 07:28:59 np0005546909 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:28:59 np0005546909 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:28:59 np0005546909 nova_compute[187208]: 2025-12-05 12:28:59.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:28:59 np0005546909 nova_compute[187208]: 2025-12-05 12:28:59.074 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:29:00 np0005546909 nova_compute[187208]: 2025-12-05 12:29:00.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:00 np0005546909 nova_compute[187208]: 2025-12-05 12:29:00.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  5 07:29:00 np0005546909 nova_compute[187208]: 2025-12-05 12:29:00.061 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:00 np0005546909 ovs-vsctl[245967]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  5 07:29:01 np0005546909 nova_compute[187208]: 2025-12-05 12:29:01.072 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:01 np0005546909 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 245802 (sos)
Dec  5 07:29:01 np0005546909 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  5 07:29:01 np0005546909 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  5 07:29:01 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  5 07:29:01 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  5 07:29:01 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 07:29:02 np0005546909 nova_compute[187208]: 2025-12-05 12:29:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:02 np0005546909 podman[246184]: 2025-12-05 12:29:02.324865155 +0000 UTC m=+0.068601448 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec  5 07:29:02 np0005546909 podman[246177]: 2025-12-05 12:29:02.333250657 +0000 UTC m=+0.076648070 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec  5 07:29:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.034 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:29:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.035 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:29:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:29:03.036 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:29:03 np0005546909 nova_compute[187208]: 2025-12-05 12:29:03.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:05 np0005546909 nova_compute[187208]: 2025-12-05 12:29:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:05 np0005546909 nova_compute[187208]: 2025-12-05 12:29:05.062 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:05 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 07:29:05 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 07:29:06 np0005546909 nova_compute[187208]: 2025-12-05 12:29:06.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:08 np0005546909 nova_compute[187208]: 2025-12-05 12:29:08.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:08 np0005546909 nova_compute[187208]: 2025-12-05 12:29:08.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:08 np0005546909 nova_compute[187208]: 2025-12-05 12:29:08.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  5 07:29:08 np0005546909 nova_compute[187208]: 2025-12-05 12:29:08.129 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  5 07:29:09 np0005546909 nova_compute[187208]: 2025-12-05 12:29:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.063 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.066 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.081 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.114 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.115 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.266 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.267 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5327MB free_disk=72.69318771362305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.268 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.268 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.508 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.508 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.581 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:29:10 np0005546909 nova_compute[187208]: 2025-12-05 12:29:10.655 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:29:12 np0005546909 nova_compute[187208]: 2025-12-05 12:29:12.157 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:29:12 np0005546909 nova_compute[187208]: 2025-12-05 12:29:12.157 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:29:12 np0005546909 podman[247392]: 2025-12-05 12:29:12.218049526 +0000 UTC m=+0.067874117 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:29:12 np0005546909 podman[247391]: 2025-12-05 12:29:12.223881094 +0000 UTC m=+0.073974812 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:29:12 np0005546909 podman[247394]: 2025-12-05 12:29:12.24350775 +0000 UTC m=+0.090301833 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Dec  5 07:29:13 np0005546909 ovs-appctl[247820]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 07:29:13 np0005546909 ovs-appctl[247824]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 07:29:13 np0005546909 ovs-appctl[247828]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  5 07:29:15 np0005546909 nova_compute[187208]: 2025-12-05 12:29:15.065 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:15 np0005546909 nova_compute[187208]: 2025-12-05 12:29:15.068 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.069 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.071 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.108 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:20 np0005546909 nova_compute[187208]: 2025-12-05 12:29:20.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:20 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 07:29:22 np0005546909 podman[249153]: 2025-12-05 12:29:22.301393958 +0000 UTC m=+0.060445193 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:29:22 np0005546909 systemd[1]: Starting Time & Date Service...
Dec  5 07:29:22 np0005546909 systemd[1]: Started Time & Date Service.
Dec  5 07:29:25 np0005546909 nova_compute[187208]: 2025-12-05 12:29:25.109 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:28 np0005546909 podman[249217]: 2025-12-05 12:29:28.497931548 +0000 UTC m=+0.066438795 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.111 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:30 np0005546909 nova_compute[187208]: 2025-12-05 12:29:30.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:32 np0005546909 podman[249239]: 2025-12-05 12:29:32.538688565 +0000 UTC m=+0.059539947 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:29:32 np0005546909 podman[249238]: 2025-12-05 12:29:32.545505271 +0000 UTC m=+0.068336920 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git)
Dec  5 07:29:35 np0005546909 nova_compute[187208]: 2025-12-05 12:29:35.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:35 np0005546909 nova_compute[187208]: 2025-12-05 12:29:35.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.116 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.118 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.119 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:40 np0005546909 nova_compute[187208]: 2025-12-05 12:29:40.120 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:42 np0005546909 podman[249277]: 2025-12-05 12:29:42.613514131 +0000 UTC m=+0.062425200 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:29:42 np0005546909 podman[249278]: 2025-12-05 12:29:42.622227762 +0000 UTC m=+0.066187159 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:29:42 np0005546909 podman[249279]: 2025-12-05 12:29:42.684346202 +0000 UTC m=+0.120566746 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  5 07:29:45 np0005546909 nova_compute[187208]: 2025-12-05 12:29:45.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:45 np0005546909 nova_compute[187208]: 2025-12-05 12:29:45.124 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:49 np0005546909 systemd[1]: session-26.scope: Deactivated successfully.
Dec  5 07:29:49 np0005546909 systemd[1]: session-26.scope: Consumed 1min 26.231s CPU time, 808.3M memory peak, read 324.0M from disk, written 31.3M to disk.
Dec  5 07:29:49 np0005546909 systemd-logind[792]: Session 26 logged out. Waiting for processes to exit.
Dec  5 07:29:49 np0005546909 systemd-logind[792]: Removed session 26.
Dec  5 07:29:49 np0005546909 systemd-logind[792]: New session 27 of user zuul.
Dec  5 07:29:49 np0005546909 systemd[1]: Started Session 27 of User zuul.
Dec  5 07:29:50 np0005546909 nova_compute[187208]: 2025-12-05 12:29:50.126 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:50 np0005546909 nova_compute[187208]: 2025-12-05 12:29:50.129 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:50 np0005546909 systemd[1]: session-27.scope: Deactivated successfully.
Dec  5 07:29:50 np0005546909 systemd-logind[792]: Session 27 logged out. Waiting for processes to exit.
Dec  5 07:29:50 np0005546909 systemd-logind[792]: Removed session 27.
Dec  5 07:29:50 np0005546909 systemd-logind[792]: New session 28 of user zuul.
Dec  5 07:29:50 np0005546909 systemd[1]: Started Session 28 of User zuul.
Dec  5 07:29:50 np0005546909 systemd[1]: session-28.scope: Deactivated successfully.
Dec  5 07:29:50 np0005546909 systemd-logind[792]: Session 28 logged out. Waiting for processes to exit.
Dec  5 07:29:50 np0005546909 systemd-logind[792]: Removed session 28.
Dec  5 07:29:52 np0005546909 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  5 07:29:52 np0005546909 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  5 07:29:52 np0005546909 podman[249401]: 2025-12-05 12:29:52.795046359 +0000 UTC m=+0.066411855 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:29:55 np0005546909 nova_compute[187208]: 2025-12-05 12:29:55.129 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:29:55 np0005546909 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:29:55 np0005546909 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:29:55 np0005546909 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:55 np0005546909 nova_compute[187208]: 2025-12-05 12:29:55.130 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:29:59 np0005546909 podman[249432]: 2025-12-05 12:29:59.206184946 +0000 UTC m=+0.059471185 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.132 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.134 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:00 np0005546909 nova_compute[187208]: 2025-12-05 12:30:00.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.136 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.137 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.137 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:01 np0005546909 nova_compute[187208]: 2025-12-05 12:30:01.159 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:30:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.036 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:30:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.037 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:30:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:30:03.037 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:30:03 np0005546909 nova_compute[187208]: 2025-12-05 12:30:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:03 np0005546909 nova_compute[187208]: 2025-12-05 12:30:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:03 np0005546909 podman[249455]: 2025-12-05 12:30:03.205455497 +0000 UTC m=+0.055460189 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  5 07:30:03 np0005546909 podman[249454]: 2025-12-05 12:30:03.205353154 +0000 UTC m=+0.057811677 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, vcs-type=git, architecture=x86_64, config_id=edpm)
Dec  5 07:30:04 np0005546909 nova_compute[187208]: 2025-12-05 12:30:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:05 np0005546909 nova_compute[187208]: 2025-12-05 12:30:05.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:08 np0005546909 nova_compute[187208]: 2025-12-05 12:30:08.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:08 np0005546909 nova_compute[187208]: 2025-12-05 12:30:08.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:10 np0005546909 nova_compute[187208]: 2025-12-05 12:30:10.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:11 np0005546909 nova_compute[187208]: 2025-12-05 12:30:11.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.089 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.090 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.090 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.235 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.04059600830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.236 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.479 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.480 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.503 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.523 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.523 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.547 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.569 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.592 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.608 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.631 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:30:12 np0005546909 nova_compute[187208]: 2025-12-05 12:30:12.632 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:30:13 np0005546909 podman[249496]: 2025-12-05 12:30:13.198655945 +0000 UTC m=+0.047911820 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:30:13 np0005546909 podman[249495]: 2025-12-05 12:30:13.198808439 +0000 UTC m=+0.050851244 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:30:13 np0005546909 podman[249497]: 2025-12-05 12:30:13.24225142 +0000 UTC m=+0.087258524 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.167 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:15 np0005546909 nova_compute[187208]: 2025-12-05 12:30:15.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:20 np0005546909 nova_compute[187208]: 2025-12-05 12:30:20.168 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:20 np0005546909 nova_compute[187208]: 2025-12-05 12:30:20.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:23 np0005546909 podman[249562]: 2025-12-05 12:30:23.192833348 +0000 UTC m=+0.051653767 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:30:25 np0005546909 nova_compute[187208]: 2025-12-05 12:30:25.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.171 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.172 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:30 np0005546909 nova_compute[187208]: 2025-12-05 12:30:30.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:30 np0005546909 podman[249586]: 2025-12-05 12:30:30.192967967 +0000 UTC m=+0.047176880 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:30:34 np0005546909 podman[249607]: 2025-12-05 12:30:34.206006547 +0000 UTC m=+0.052958344 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec  5 07:30:34 np0005546909 podman[249606]: 2025-12-05 12:30:34.207698706 +0000 UTC m=+0.058235925 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, architecture=x86_64)
Dec  5 07:30:35 np0005546909 nova_compute[187208]: 2025-12-05 12:30:35.174 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.367 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:30:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.177 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.225 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:40 np0005546909 nova_compute[187208]: 2025-12-05 12:30:40.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:44 np0005546909 podman[249641]: 2025-12-05 12:30:44.196814686 +0000 UTC m=+0.052001017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  5 07:30:44 np0005546909 podman[249642]: 2025-12-05 12:30:44.197000451 +0000 UTC m=+0.048468616 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:30:44 np0005546909 podman[249643]: 2025-12-05 12:30:44.232192676 +0000 UTC m=+0.077329840 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  5 07:30:45 np0005546909 nova_compute[187208]: 2025-12-05 12:30:45.224 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:50 np0005546909 nova_compute[187208]: 2025-12-05 12:30:50.226 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:53 np0005546909 podman[249717]: 2025-12-05 12:30:53.906184433 +0000 UTC m=+0.078096823 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.272 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.273 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.274 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.275 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:30:55 np0005546909 nova_compute[187208]: 2025-12-05 12:30:55.276 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.277 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.280 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.310 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.311 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.632 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.633 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.633 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:31:00 np0005546909 nova_compute[187208]: 2025-12-05 12:31:00.649 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:31:01 np0005546909 nova_compute[187208]: 2025-12-05 12:31:01.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:01 np0005546909 nova_compute[187208]: 2025-12-05 12:31:01.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:31:01 np0005546909 podman[249746]: 2025-12-05 12:31:01.209975357 +0000 UTC m=+0.066160462 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:31:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.038 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:31:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.039 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:31:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:31:03.039 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:31:04 np0005546909 nova_compute[187208]: 2025-12-05 12:31:04.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:05 np0005546909 podman[249767]: 2025-12-05 12:31:05.196914261 +0000 UTC m=+0.043751271 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  5 07:31:05 np0005546909 podman[249766]: 2025-12-05 12:31:05.200813653 +0000 UTC m=+0.048526468 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm)
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.312 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.313 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:05 np0005546909 nova_compute[187208]: 2025-12-05 12:31:05.314 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:06 np0005546909 nova_compute[187208]: 2025-12-05 12:31:06.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:07 np0005546909 nova_compute[187208]: 2025-12-05 12:31:07.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:09 np0005546909 nova_compute[187208]: 2025-12-05 12:31:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:09 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:31:09 np0005546909 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  5 07:31:10 np0005546909 nova_compute[187208]: 2025-12-05 12:31:10.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:10 np0005546909 nova_compute[187208]: 2025-12-05 12:31:10.315 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.167 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.167 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.168 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.168 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.325 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.326 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.326 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.327 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.463 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.463 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.529 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.553 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.555 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:31:14 np0005546909 nova_compute[187208]: 2025-12-05 12:31:14.555 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:31:15 np0005546909 podman[249805]: 2025-12-05 12:31:15.206176156 +0000 UTC m=+0.051785130 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:31:15 np0005546909 podman[249804]: 2025-12-05 12:31:15.208184444 +0000 UTC m=+0.057515835 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  5 07:31:15 np0005546909 podman[249806]: 2025-12-05 12:31:15.253790897 +0000 UTC m=+0.098935438 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.317 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.318 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:15 np0005546909 nova_compute[187208]: 2025-12-05 12:31:15.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:20 np0005546909 nova_compute[187208]: 2025-12-05 12:31:20.319 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:24 np0005546909 podman[249872]: 2025-12-05 12:31:24.19583079 +0000 UTC m=+0.046988884 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:31:25 np0005546909 nova_compute[187208]: 2025-12-05 12:31:25.321 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.325 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.327 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.355 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:30 np0005546909 nova_compute[187208]: 2025-12-05 12:31:30.356 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:32 np0005546909 podman[249896]: 2025-12-05 12:31:32.198792321 +0000 UTC m=+0.053421137 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec  5 07:31:35 np0005546909 nova_compute[187208]: 2025-12-05 12:31:35.357 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:36 np0005546909 podman[249915]: 2025-12-05 12:31:36.220917902 +0000 UTC m=+0.057838024 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:31:36 np0005546909 podman[249914]: 2025-12-05 12:31:36.220904961 +0000 UTC m=+0.063636509 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter)
Dec  5 07:31:40 np0005546909 nova_compute[187208]: 2025-12-05 12:31:40.358 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:45 np0005546909 nova_compute[187208]: 2025-12-05 12:31:45.360 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:46 np0005546909 podman[249955]: 2025-12-05 12:31:46.211600676 +0000 UTC m=+0.052798890 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:31:46 np0005546909 podman[249954]: 2025-12-05 12:31:46.218937135 +0000 UTC m=+0.065338477 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  5 07:31:46 np0005546909 podman[249956]: 2025-12-05 12:31:46.252076702 +0000 UTC m=+0.086991426 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:31:50 np0005546909 nova_compute[187208]: 2025-12-05 12:31:50.361 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:55 np0005546909 podman[250020]: 2025-12-05 12:31:55.202654548 +0000 UTC m=+0.058075800 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.363 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.365 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.400 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:31:55 np0005546909 nova_compute[187208]: 2025-12-05 12:31:55.401 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:32:00 np0005546909 nova_compute[187208]: 2025-12-05 12:32:00.402 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:00 np0005546909 nova_compute[187208]: 2025-12-05 12:32:00.556 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:00 np0005546909 nova_compute[187208]: 2025-12-05 12:32:00.557 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:32:00 np0005546909 nova_compute[187208]: 2025-12-05 12:32:00.557 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:32:00 np0005546909 nova_compute[187208]: 2025-12-05 12:32:00.573 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:32:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:32:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:32:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:32:03.040 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:32:03 np0005546909 nova_compute[187208]: 2025-12-05 12:32:03.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:03 np0005546909 nova_compute[187208]: 2025-12-05 12:32:03.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:32:03 np0005546909 podman[250044]: 2025-12-05 12:32:03.199441801 +0000 UTC m=+0.051212395 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:32:05 np0005546909 nova_compute[187208]: 2025-12-05 12:32:05.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:05 np0005546909 nova_compute[187208]: 2025-12-05 12:32:05.404 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:05 np0005546909 nova_compute[187208]: 2025-12-05 12:32:05.405 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:06 np0005546909 nova_compute[187208]: 2025-12-05 12:32:06.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:07 np0005546909 nova_compute[187208]: 2025-12-05 12:32:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:07 np0005546909 podman[250062]: 2025-12-05 12:32:07.198915845 +0000 UTC m=+0.056610629 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec  5 07:32:07 np0005546909 podman[250063]: 2025-12-05 12:32:07.211106953 +0000 UTC m=+0.059741648 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:32:09 np0005546909 nova_compute[187208]: 2025-12-05 12:32:09.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:09 np0005546909 nova_compute[187208]: 2025-12-05 12:32:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:10 np0005546909 nova_compute[187208]: 2025-12-05 12:32:10.406 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:12 np0005546909 nova_compute[187208]: 2025-12-05 12:32:12.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:12 np0005546909 nova_compute[187208]: 2025-12-05 12:32:12.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:15 np0005546909 nova_compute[187208]: 2025-12-05 12:32:15.409 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.226 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.227 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.386 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.387 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5671MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.448 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.448 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.471 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.483 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.485 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:32:16 np0005546909 nova_compute[187208]: 2025-12-05 12:32:16.485 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:32:17 np0005546909 podman[250102]: 2025-12-05 12:32:17.199534064 +0000 UTC m=+0.050675468 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:32:17 np0005546909 podman[250101]: 2025-12-05 12:32:17.204327881 +0000 UTC m=+0.059915543 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec  5 07:32:17 np0005546909 podman[250103]: 2025-12-05 12:32:17.242166843 +0000 UTC m=+0.088678805 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.410 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.412 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.460 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:20 np0005546909 nova_compute[187208]: 2025-12-05 12:32:20.461 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:32:25 np0005546909 nova_compute[187208]: 2025-12-05 12:32:25.462 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:26 np0005546909 podman[250171]: 2025-12-05 12:32:26.196097615 +0000 UTC m=+0.050664758 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:32:30 np0005546909 nova_compute[187208]: 2025-12-05 12:32:30.464 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:34 np0005546909 podman[250195]: 2025-12-05 12:32:34.205388977 +0000 UTC m=+0.058245605 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:32:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:32:35 np0005546909 nova_compute[187208]: 2025-12-05 12:32:35.466 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:38 np0005546909 podman[250218]: 2025-12-05 12:32:38.203822531 +0000 UTC m=+0.050243586 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:32:38 np0005546909 podman[250217]: 2025-12-05 12:32:38.211802379 +0000 UTC m=+0.062106145 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible)
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.468 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.470 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:32:40 np0005546909 nova_compute[187208]: 2025-12-05 12:32:40.472 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:45 np0005546909 nova_compute[187208]: 2025-12-05 12:32:45.471 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:32:48 np0005546909 podman[250255]: 2025-12-05 12:32:48.208167055 +0000 UTC m=+0.058726159 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:32:48 np0005546909 podman[250256]: 2025-12-05 12:32:48.231107291 +0000 UTC m=+0.078932817 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:32:48 np0005546909 podman[250257]: 2025-12-05 12:32:48.239769338 +0000 UTC m=+0.083796025 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  5 07:32:50 np0005546909 nova_compute[187208]: 2025-12-05 12:32:50.473 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:55 np0005546909 nova_compute[187208]: 2025-12-05 12:32:55.474 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:32:57 np0005546909 podman[250323]: 2025-12-05 12:32:57.197569431 +0000 UTC m=+0.052729607 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.476 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.478 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.479 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.523 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:00 np0005546909 nova_compute[187208]: 2025-12-05 12:33:00.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:02 np0005546909 nova_compute[187208]: 2025-12-05 12:33:02.486 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:02 np0005546909 nova_compute[187208]: 2025-12-05 12:33:02.487 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:33:02 np0005546909 nova_compute[187208]: 2025-12-05 12:33:02.487 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:33:02 np0005546909 nova_compute[187208]: 2025-12-05 12:33:02.527 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:33:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.041 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:33:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.041 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:33:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:33:04 np0005546909 nova_compute[187208]: 2025-12-05 12:33:04.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:04 np0005546909 nova_compute[187208]: 2025-12-05 12:33:04.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:33:05 np0005546909 podman[250349]: 2025-12-05 12:33:05.195133979 +0000 UTC m=+0.050441932 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  5 07:33:05 np0005546909 nova_compute[187208]: 2025-12-05 12:33:05.524 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:07 np0005546909 nova_compute[187208]: 2025-12-05 12:33:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:08 np0005546909 nova_compute[187208]: 2025-12-05 12:33:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:08 np0005546909 nova_compute[187208]: 2025-12-05 12:33:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:09 np0005546909 nova_compute[187208]: 2025-12-05 12:33:09.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:09 np0005546909 podman[250370]: 2025-12-05 12:33:09.205670039 +0000 UTC m=+0.050211446 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  5 07:33:09 np0005546909 podman[250369]: 2025-12-05 12:33:09.209746235 +0000 UTC m=+0.060073887 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, vcs-type=git)
Dec  5 07:33:10 np0005546909 nova_compute[187208]: 2025-12-05 12:33:10.526 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:11 np0005546909 nova_compute[187208]: 2025-12-05 12:33:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:11 np0005546909 nova_compute[187208]: 2025-12-05 12:33:11.832 187212 DEBUG oslo_concurrency.processutils [None req-dbb20e6f-cbb5-4adf-8656-90ad61357de6 6b85417e2d5f492ab96282fdfe0b4f64 3df4e4eed3454c178c5281d12024579e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  5 07:33:11 np0005546909 nova_compute[187208]: 2025-12-05 12:33:11.868 187212 DEBUG oslo_concurrency.processutils [None req-dbb20e6f-cbb5-4adf-8656-90ad61357de6 6b85417e2d5f492ab96282fdfe0b4f64 3df4e4eed3454c178c5281d12024579e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  5 07:33:12 np0005546909 nova_compute[187208]: 2025-12-05 12:33:12.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.528 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.531 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.569 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:15 np0005546909 nova_compute[187208]: 2025-12-05 12:33:15.570 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.099 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.100 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.100 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.288 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.290 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5668MB free_disk=73.04075241088867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.291 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.291 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.378 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.379 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.416 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.439 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.440 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:33:17 np0005546909 nova_compute[187208]: 2025-12-05 12:33:17.441 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:33:18 np0005546909 nova_compute[187208]: 2025-12-05 12:33:18.184 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:18.184 104471 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:2d:f8', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'f2:db:9f:44:21:24'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  5 07:33:18 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:18.185 104471 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  5 07:33:19 np0005546909 podman[250411]: 2025-12-05 12:33:19.207051568 +0000 UTC m=+0.056245668 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:33:19 np0005546909 podman[250410]: 2025-12-05 12:33:19.207034737 +0000 UTC m=+0.062882017 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Dec  5 07:33:19 np0005546909 podman[250412]: 2025-12-05 12:33:19.240141653 +0000 UTC m=+0.082910170 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:33:20 np0005546909 nova_compute[187208]: 2025-12-05 12:33:20.571 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:20 np0005546909 nova_compute[187208]: 2025-12-05 12:33:20.572 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.574 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.576 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.611 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:33:25 np0005546909 nova_compute[187208]: 2025-12-05 12:33:25.612 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:33:28 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:33:28.187 104471 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2686fa45-e88c-4058-8865-e810ceb89d95, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  5 07:33:28 np0005546909 podman[250478]: 2025-12-05 12:33:28.19918396 +0000 UTC m=+0.056402532 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.060 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.061 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.062 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.063 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.079 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.088 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 WARNING nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.089 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Removable base files: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524 /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15 /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89 /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/c47971824929b7466134c539db51093d53350524
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/1e39c16656988ee114089078431239bf806417db
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.090 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3f7693dca94777de01080bcebdbe8d46d5f07f15
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/500e35ee2103f2c4f4a6ec1ea29df7bae1c51c89
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/4e67c74a736d89d49bae230086f8944c0448c13d
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f6efb356df4fb897f7bcc70928d045e44798ba61
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.091 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.092 187212 DEBUG nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.092 187212 INFO nova.virt.libvirt.imagecache [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.613 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.615 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.648 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:33:30 np0005546909 nova_compute[187208]: 2025-12-05 12:33:30.649 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.651 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5047 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.697 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.698 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:35 np0005546909 nova_compute[187208]: 2025-12-05 12:33:35.700 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:36 np0005546909 podman[250502]: 2025-12-05 12:33:36.2280317 +0000 UTC m=+0.079145712 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:33:40 np0005546909 podman[250523]: 2025-12-05 12:33:40.211890618 +0000 UTC m=+0.056320641 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  5 07:33:40 np0005546909 podman[250522]: 2025-12-05 12:33:40.240113384 +0000 UTC m=+0.092114673 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.702 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.704 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.739 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:33:40 np0005546909 nova_compute[187208]: 2025-12-05 12:33:40.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:33:45 np0005546909 nova_compute[187208]: 2025-12-05 12:33:45.740 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:33:50 np0005546909 podman[250559]: 2025-12-05 12:33:50.205954498 +0000 UTC m=+0.062240219 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Dec  5 07:33:50 np0005546909 podman[250560]: 2025-12-05 12:33:50.232127836 +0000 UTC m=+0.083905349 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:33:50 np0005546909 podman[250561]: 2025-12-05 12:33:50.241278397 +0000 UTC m=+0.087560463 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec  5 07:33:50 np0005546909 nova_compute[187208]: 2025-12-05 12:33:50.743 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:33:55 np0005546909 nova_compute[187208]: 2025-12-05 12:33:55.744 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:33:59 np0005546909 podman[250624]: 2025-12-05 12:33:59.193090699 +0000 UTC m=+0.049847395 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.746 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.747 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:34:00 np0005546909 nova_compute[187208]: 2025-12-05 12:34:00.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:02 np0005546909 nova_compute[187208]: 2025-12-05 12:34:02.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:02 np0005546909 nova_compute[187208]: 2025-12-05 12:34:02.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  5 07:34:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:34:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.042 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:34:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:34:03.043 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:34:04 np0005546909 nova_compute[187208]: 2025-12-05 12:34:04.178 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:04 np0005546909 nova_compute[187208]: 2025-12-05 12:34:04.179 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  5 07:34:04 np0005546909 nova_compute[187208]: 2025-12-05 12:34:04.179 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  5 07:34:04 np0005546909 nova_compute[187208]: 2025-12-05 12:34:04.205 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  5 07:34:05 np0005546909 nova_compute[187208]: 2025-12-05 12:34:05.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:05 np0005546909 nova_compute[187208]: 2025-12-05 12:34:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  5 07:34:05 np0005546909 nova_compute[187208]: 2025-12-05 12:34:05.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:07 np0005546909 nova_compute[187208]: 2025-12-05 12:34:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:07 np0005546909 podman[250649]: 2025-12-05 12:34:07.193924179 +0000 UTC m=+0.051473172 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:34:08 np0005546909 nova_compute[187208]: 2025-12-05 12:34:08.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:08 np0005546909 nova_compute[187208]: 2025-12-05 12:34:08.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  5 07:34:08 np0005546909 nova_compute[187208]: 2025-12-05 12:34:08.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  5 07:34:09 np0005546909 nova_compute[187208]: 2025-12-05 12:34:09.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:10 np0005546909 nova_compute[187208]: 2025-12-05 12:34:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:10 np0005546909 nova_compute[187208]: 2025-12-05 12:34:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:10 np0005546909 nova_compute[187208]: 2025-12-05 12:34:10.748 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:10 np0005546909 nova_compute[187208]: 2025-12-05 12:34:10.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:11 np0005546909 nova_compute[187208]: 2025-12-05 12:34:11.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:11 np0005546909 podman[250670]: 2025-12-05 12:34:11.235262078 +0000 UTC m=+0.070402883 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec  5 07:34:11 np0005546909 podman[250669]: 2025-12-05 12:34:11.235104774 +0000 UTC m=+0.073645286 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Dec  5 07:34:14 np0005546909 nova_compute[187208]: 2025-12-05 12:34:14.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:15 np0005546909 nova_compute[187208]: 2025-12-05 12:34:15.750 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:16 np0005546909 nova_compute[187208]: 2025-12-05 12:34:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.093 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.094 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.094 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.232 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.0407485961914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.233 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.292 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.292 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.371 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.386 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.388 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.388 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  5 07:34:19 np0005546909 nova_compute[187208]: 2025-12-05 12:34:19.389 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  5 07:34:20 np0005546909 nova_compute[187208]: 2025-12-05 12:34:20.752 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:21 np0005546909 podman[250708]: 2025-12-05 12:34:21.201780292 +0000 UTC m=+0.049457134 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:34:21 np0005546909 podman[250709]: 2025-12-05 12:34:21.228244068 +0000 UTC m=+0.075031125 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec  5 07:34:21 np0005546909 podman[250707]: 2025-12-05 12:34:21.228043882 +0000 UTC m=+0.082765626 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  5 07:34:25 np0005546909 nova_compute[187208]: 2025-12-05 12:34:25.754 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:34:30 np0005546909 podman[250774]: 2025-12-05 12:34:30.190494317 +0000 UTC m=+0.048326181 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.756 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.757 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  5 07:34:30 np0005546909 nova_compute[187208]: 2025-12-05 12:34:30.758 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.368 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:34:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.760 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.761 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.762 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:34:35 np0005546909 nova_compute[187208]: 2025-12-05 12:34:35.795 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:34:38 np0005546909 podman[250798]: 2025-12-05 12:34:38.201153939 +0000 UTC m=+0.052558633 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.797 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.799 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.800 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:34:40 np0005546909 nova_compute[187208]: 2025-12-05 12:34:40.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:34:42 np0005546909 podman[250818]: 2025-12-05 12:34:42.264411184 +0000 UTC m=+0.087708297 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Dec  5 07:34:42 np0005546909 podman[250819]: 2025-12-05 12:34:42.279191257 +0000 UTC m=+0.099475554 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  5 07:34:45 np0005546909 nova_compute[187208]: 2025-12-05 12:34:45.801 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:34:47 np0005546909 nova_compute[187208]: 2025-12-05 12:34:47.422 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:34:50 np0005546909 nova_compute[187208]: 2025-12-05 12:34:50.802 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:34:52 np0005546909 podman[250860]: 2025-12-05 12:34:52.206552292 +0000 UTC m=+0.053043017 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:34:52 np0005546909 podman[250859]: 2025-12-05 12:34:52.216079674 +0000 UTC m=+0.066968675 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec  5 07:34:52 np0005546909 podman[250861]: 2025-12-05 12:34:52.235579941 +0000 UTC m=+0.079543954 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  5 07:34:55 np0005546909 nova_compute[187208]: 2025-12-05 12:34:55.804 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:00 np0005546909 nova_compute[187208]: 2025-12-05 12:35:00.806 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:01 np0005546909 podman[250925]: 2025-12-05 12:35:01.193787837 +0000 UTC m=+0.048337972 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  5 07:35:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.043 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.044 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:35:03.044 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.074 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.075 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:35:05 np0005546909 nova_compute[187208]: 2025-12-05 12:35:05.808 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:09 np0005546909 nova_compute[187208]: 2025-12-05 12:35:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:09 np0005546909 nova_compute[187208]: 2025-12-05 12:35:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:09 np0005546909 podman[250949]: 2025-12-05 12:35:09.224942383 +0000 UTC m=+0.078996558 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec  5 07:35:10 np0005546909 nova_compute[187208]: 2025-12-05 12:35:10.810 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:11 np0005546909 nova_compute[187208]: 2025-12-05 12:35:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:12 np0005546909 nova_compute[187208]: 2025-12-05 12:35:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:12 np0005546909 nova_compute[187208]: 2025-12-05 12:35:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:13 np0005546909 podman[250970]: 2025-12-05 12:35:13.246392124 +0000 UTC m=+0.092004030 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, 
Inc., release=1755695350, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:35:13 np0005546909 podman[250971]: 2025-12-05 12:35:13.247606169 +0000 UTC m=+0.088262393 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  5 07:35:15 np0005546909 nova_compute[187208]: 2025-12-05 12:35:15.811 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:16 np0005546909 nova_compute[187208]: 2025-12-05 12:35:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.098 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.274 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5664MB free_disk=73.0407485961914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.276 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.427 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.427 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.442 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing inventories for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.550 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating ProviderTree inventory for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.550 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Updating inventory in ProviderTree for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.565 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing aggregate associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.583 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Refreshing trait associations for resource provider 5111707b-bdc3-4252-b5b7-b3e96ff05344, traits: COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SHA,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_FMA3,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_BMI2,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE2,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.605 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.621 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.622 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:35:19 np0005546909 nova_compute[187208]: 2025-12-05 12:35:19.622 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:35:20 np0005546909 nova_compute[187208]: 2025-12-05 12:35:20.813 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:23 np0005546909 podman[251012]: 2025-12-05 12:35:23.203035676 +0000 UTC m=+0.056046442 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  5 07:35:23 np0005546909 podman[251013]: 2025-12-05 12:35:23.223895782 +0000 UTC m=+0.072703618 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec  5 07:35:23 np0005546909 podman[251014]: 2025-12-05 12:35:23.256969257 +0000 UTC m=+0.101828650 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec  5 07:35:25 np0005546909 nova_compute[187208]: 2025-12-05 12:35:25.814 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:30 np0005546909 nova_compute[187208]: 2025-12-05 12:35:30.816 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:32 np0005546909 podman[251077]: 2025-12-05 12:35:32.199152015 +0000 UTC m=+0.051400440 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  5 07:35:35 np0005546909 nova_compute[187208]: 2025-12-05 12:35:35.818 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:35 np0005546909 nova_compute[187208]: 2025-12-05 12:35:35.819 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:40 np0005546909 podman[251101]: 2025-12-05 12:35:40.202485426 +0000 UTC m=+0.053301184 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.820 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.821 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.822 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:35:40 np0005546909 nova_compute[187208]: 2025-12-05 12:35:40.824 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:35:44 np0005546909 podman[251121]: 2025-12-05 12:35:44.204286756 +0000 UTC m=+0.056494095 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc.)
Dec  5 07:35:44 np0005546909 podman[251122]: 2025-12-05 12:35:44.210169994 +0000 UTC m=+0.052079639 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  5 07:35:45 np0005546909 nova_compute[187208]: 2025-12-05 12:35:45.823 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:35:50 np0005546909 nova_compute[187208]: 2025-12-05 12:35:50.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:54 np0005546909 podman[251160]: 2025-12-05 12:35:54.207750865 +0000 UTC m=+0.058063760 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  5 07:35:54 np0005546909 podman[251161]: 2025-12-05 12:35:54.227359606 +0000 UTC m=+0.075099787 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:35:54 np0005546909 podman[251162]: 2025-12-05 12:35:54.23064353 +0000 UTC m=+0.072579035 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  5 07:35:55 np0005546909 nova_compute[187208]: 2025-12-05 12:35:55.826 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:35:55 np0005546909 nova_compute[187208]: 2025-12-05 12:35:55.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:00 np0005546909 nova_compute[187208]: 2025-12-05 12:36:00.828 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:36:03.046 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:03 np0005546909 podman[251226]: 2025-12-05 12:36:03.235057985 +0000 UTC m=+0.085370540 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.623 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.624 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.624 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.637 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.638 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.638 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.831 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.833 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.879 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:05 np0005546909 nova_compute[187208]: 2025-12-05 12:36:05.880 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:09 np0005546909 nova_compute[187208]: 2025-12-05 12:36:09.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:10 np0005546909 nova_compute[187208]: 2025-12-05 12:36:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:10 np0005546909 nova_compute[187208]: 2025-12-05 12:36:10.882 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:11 np0005546909 nova_compute[187208]: 2025-12-05 12:36:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:11 np0005546909 podman[251251]: 2025-12-05 12:36:11.20939284 +0000 UTC m=+0.059037618 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  5 07:36:12 np0005546909 nova_compute[187208]: 2025-12-05 12:36:12.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:12 np0005546909 nova_compute[187208]: 2025-12-05 12:36:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:15 np0005546909 podman[251271]: 2025-12-05 12:36:15.215452333 +0000 UTC m=+0.065622936 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  5 07:36:15 np0005546909 podman[251272]: 2025-12-05 12:36:15.237177054 +0000 UTC m=+0.083021013 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  5 07:36:15 np0005546909 nova_compute[187208]: 2025-12-05 12:36:15.884 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:15 np0005546909 nova_compute[187208]: 2025-12-05 12:36:15.886 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:16 np0005546909 nova_compute[187208]: 2025-12-05 12:36:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.096 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.097 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.244 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.245 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5665MB free_disk=73.04133605957031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.246 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.349 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.350 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.372 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.384 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.386 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:36:19 np0005546909 nova_compute[187208]: 2025-12-05 12:36:19.386 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.380 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.887 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.889 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:20 np0005546909 nova_compute[187208]: 2025-12-05 12:36:20.891 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:25 np0005546909 podman[251311]: 2025-12-05 12:36:25.212048066 +0000 UTC m=+0.062951520 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  5 07:36:25 np0005546909 podman[251312]: 2025-12-05 12:36:25.235136346 +0000 UTC m=+0.083410605 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  5 07:36:25 np0005546909 podman[251313]: 2025-12-05 12:36:25.247841179 +0000 UTC m=+0.091087824 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller)
Dec  5 07:36:25 np0005546909 nova_compute[187208]: 2025-12-05 12:36:25.892 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:30 np0005546909 nova_compute[187208]: 2025-12-05 12:36:30.894 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:34 np0005546909 podman[251380]: 2025-12-05 12:36:34.203970563 +0000 UTC m=+0.057233756 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.369 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 ceilometer_agent_compute[197913]: 2025-12-05 12:36:35.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.898 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.900 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.901 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:35 np0005546909 nova_compute[187208]: 2025-12-05 12:36:35.904 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.906 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:40 np0005546909 nova_compute[187208]: 2025-12-05 12:36:40.907 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:42 np0005546909 podman[251404]: 2025-12-05 12:36:42.210305542 +0000 UTC m=+0.064164234 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec  5 07:36:45 np0005546909 nova_compute[187208]: 2025-12-05 12:36:45.911 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:45 np0005546909 nova_compute[187208]: 2025-12-05 12:36:45.912 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:45 np0005546909 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:45 np0005546909 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:45 np0005546909 nova_compute[187208]: 2025-12-05 12:36:45.913 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:46 np0005546909 podman[251425]: 2025-12-05 12:36:46.011434919 +0000 UTC m=+0.064121374 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  5 07:36:46 np0005546909 podman[251426]: 2025-12-05 12:36:46.03388024 +0000 UTC m=+0.084664640 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.915 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.916 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.916 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.917 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:36:50 np0005546909 nova_compute[187208]: 2025-12-05 12:36:50.918 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:55 np0005546909 nova_compute[187208]: 2025-12-05 12:36:55.919 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:36:56 np0005546909 podman[251464]: 2025-12-05 12:36:56.204366181 +0000 UTC m=+0.055458966 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec  5 07:36:56 np0005546909 podman[251465]: 2025-12-05 12:36:56.22847942 +0000 UTC m=+0.078189895 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  5 07:36:56 np0005546909 podman[251463]: 2025-12-05 12:36:56.238930199 +0000 UTC m=+0.092138384 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  5 07:37:00 np0005546909 nova_compute[187208]: 2025-12-05 12:37:00.921 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.048 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.048 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:37:03.049 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.060 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.061 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.075 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.076 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.076 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:37:05 np0005546909 podman[251529]: 2025-12-05 12:37:05.206325216 +0000 UTC m=+0.056815374 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:37:05 np0005546909 nova_compute[187208]: 2025-12-05 12:37:05.923 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:09 np0005546909 nova_compute[187208]: 2025-12-05 12:37:09.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:10 np0005546909 nova_compute[187208]: 2025-12-05 12:37:10.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:10 np0005546909 nova_compute[187208]: 2025-12-05 12:37:10.924 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:12 np0005546909 nova_compute[187208]: 2025-12-05 12:37:12.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:13 np0005546909 nova_compute[187208]: 2025-12-05 12:37:13.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:13 np0005546909 nova_compute[187208]: 2025-12-05 12:37:13.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:13 np0005546909 podman[251554]: 2025-12-05 12:37:13.221347501 +0000 UTC m=+0.061698514 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Dec  5 07:37:15 np0005546909 nova_compute[187208]: 2025-12-05 12:37:15.926 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:16 np0005546909 nova_compute[187208]: 2025-12-05 12:37:16.055 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:16 np0005546909 podman[251575]: 2025-12-05 12:37:16.211064624 +0000 UTC m=+0.056138375 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  5 07:37:16 np0005546909 podman[251574]: 2025-12-05 12:37:16.237768867 +0000 UTC m=+0.088197751 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Dec  5 07:37:20 np0005546909 nova_compute[187208]: 2025-12-05 12:37:20.928 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:20 np0005546909 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:20 np0005546909 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:37:20 np0005546909 nova_compute[187208]: 2025-12-05 12:37:20.931 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.113 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.114 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.137 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.138 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.304 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.0394058227539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.306 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.615 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.616 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.639 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.758 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.760 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:37:21 np0005546909 nova_compute[187208]: 2025-12-05 12:37:21.760 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:37:26 np0005546909 nova_compute[187208]: 2025-12-05 12:37:26.115 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:27 np0005546909 podman[251617]: 2025-12-05 12:37:27.225089097 +0000 UTC m=+0.062428135 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:37:27 np0005546909 podman[251616]: 2025-12-05 12:37:27.225456247 +0000 UTC m=+0.067251163 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  5 07:37:27 np0005546909 podman[251618]: 2025-12-05 12:37:27.284983968 +0000 UTC m=+0.117368905 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  5 07:37:31 np0005546909 nova_compute[187208]: 2025-12-05 12:37:31.117 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.121 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.123 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.160 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:36 np0005546909 nova_compute[187208]: 2025-12-05 12:37:36.161 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:36 np0005546909 podman[251684]: 2025-12-05 12:37:36.243437979 +0000 UTC m=+0.051322737 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.162 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.163 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:37:41 np0005546909 nova_compute[187208]: 2025-12-05 12:37:41.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:44 np0005546909 podman[251709]: 2025-12-05 12:37:44.204142384 +0000 UTC m=+0.059443700 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  5 07:37:46 np0005546909 nova_compute[187208]: 2025-12-05 12:37:46.164 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:47 np0005546909 podman[251730]: 2025-12-05 12:37:47.209938974 +0000 UTC m=+0.052202852 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  5 07:37:47 np0005546909 podman[251729]: 2025-12-05 12:37:47.212766575 +0000 UTC m=+0.058620226 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Dec  5 07:37:51 np0005546909 nova_compute[187208]: 2025-12-05 12:37:51.166 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:56 np0005546909 nova_compute[187208]: 2025-12-05 12:37:56.169 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:37:58 np0005546909 podman[251771]: 2025-12-05 12:37:58.214815715 +0000 UTC m=+0.063052532 container health_status 164bd37296ada7ef72200d1f75eda4ea42b3caf08e6252bbd3fa7323281a37bb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  5 07:37:58 np0005546909 podman[251772]: 2025-12-05 12:37:58.217564634 +0000 UTC m=+0.060164710 container health_status 5bed5e91b65380c5bbff1d8d6fbe400edc1cb44caf6622c14eb3a4a9fb781df5 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec  5 07:37:58 np0005546909 podman[251773]: 2025-12-05 12:37:58.248804386 +0000 UTC m=+0.089170288 container health_status 6ba51f9558d8370f83a89d8300805f9a512b34c36e6e8000c367517942343698 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec  5 07:38:01 np0005546909 nova_compute[187208]: 2025-12-05 12:38:01.170 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:38:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.049 104471 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.050 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:03 np0005546909 ovn_metadata_agent[104466]: 2025-12-05 12:38:03.050 104471 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:38:06 np0005546909 nova_compute[187208]: 2025-12-05 12:38:06.173 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:38:06 np0005546909 nova_compute[187208]: 2025-12-05 12:38:06.761 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:06 np0005546909 nova_compute[187208]: 2025-12-05 12:38:06.762 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  5 07:38:07 np0005546909 nova_compute[187208]: 2025-12-05 12:38:07.061 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:07 np0005546909 nova_compute[187208]: 2025-12-05 12:38:07.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  5 07:38:07 np0005546909 nova_compute[187208]: 2025-12-05 12:38:07.062 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  5 07:38:07 np0005546909 nova_compute[187208]: 2025-12-05 12:38:07.078 187212 DEBUG nova.compute.manager [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  5 07:38:07 np0005546909 podman[251843]: 2025-12-05 12:38:07.19337465 +0000 UTC m=+0.050549005 container health_status 55cb3cd79cff636b3a6ab4528e35f13c8f293bd13490bca9b251d9dfe70f3ed6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  5 07:38:11 np0005546909 nova_compute[187208]: 2025-12-05 12:38:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:11 np0005546909 nova_compute[187208]: 2025-12-05 12:38:11.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:11 np0005546909 nova_compute[187208]: 2025-12-05 12:38:11.175 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:38:12 np0005546909 systemd-logind[792]: New session 29 of user zuul.
Dec  5 07:38:12 np0005546909 systemd[1]: Started Session 29 of User zuul.
Dec  5 07:38:14 np0005546909 nova_compute[187208]: 2025-12-05 12:38:14.060 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:14 np0005546909 nova_compute[187208]: 2025-12-05 12:38:14.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:15 np0005546909 nova_compute[187208]: 2025-12-05 12:38:15.062 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:15 np0005546909 podman[252012]: 2025-12-05 12:38:15.178812931 +0000 UTC m=+0.069580879 container health_status 5b9d6ce01b329bba2bddabcab6f549f0652a91697ff9c747733859233e06ffa1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm)
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.178 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.180 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.181 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.230 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:38:16 np0005546909 nova_compute[187208]: 2025-12-05 12:38:16.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec  5 07:38:17 np0005546909 nova_compute[187208]: 2025-12-05 12:38:17.056 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:17 np0005546909 ovs-vsctl[252064]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  5 07:38:18 np0005546909 podman[252112]: 2025-12-05 12:38:18.225494421 +0000 UTC m=+0.067966583 container health_status 1f16523fd3d4f228955f3777c9cf70a4fe33aa8d91e6f8283c0a41fe6616ebd7 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec  5 07:38:18 np0005546909 podman[252113]: 2025-12-05 12:38:18.238834592 +0000 UTC m=+0.081382736 container health_status de6cf762c1cafb8e40db76f1b40976ba0e6c9d938ce9e25ceb93c20594f94edc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  5 07:38:18 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  5 07:38:18 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  5 07:38:18 np0005546909 virtqemud[186841]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.059 187212 DEBUG oslo_service.periodic_task [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.231 187212 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 36 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.500 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.501 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.501 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.502 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  5 07:38:21 np0005546909 systemd[1]: Starting Hostname Service...
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.656 187212 WARNING nova.virt.libvirt.driver [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.657 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5373MB free_disk=72.95220565795898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.658 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.658 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  5 07:38:21 np0005546909 systemd[1]: Started Hostname Service.
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.753 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.755 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.786 187212 DEBUG nova.compute.provider_tree [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed in ProviderTree for provider: 5111707b-bdc3-4252-b5b7-b3e96ff05344 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.806 187212 DEBUG nova.scheduler.client.report [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Inventory has not changed for provider 5111707b-bdc3-4252-b5b7-b3e96ff05344 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.828 187212 DEBUG nova.compute.resource_tracker [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  5 07:38:21 np0005546909 nova_compute[187208]: 2025-12-05 12:38:21.829 187212 DEBUG oslo_concurrency.lockutils [None req-51f46adb-490b-40b4-b21c-d46c3986f075 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
